Dec 02 15:52:15 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 15:52:15 crc restorecon[4715]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:15 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 
15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 15:52:16 crc 
restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc 
restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 15:52:16 crc restorecon[4715]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 15:52:16 crc kubenswrapper[4933]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 15:52:16 crc kubenswrapper[4933]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 15:52:16 crc kubenswrapper[4933]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 15:52:16 crc kubenswrapper[4933]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
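The long run of "not reset as customized by admin" messages above is expected behavior rather than an error: on RHEL/Fedora-family systems, container_file_t is typically listed among the SELinux "customizable types", and restorecon deliberately leaves files carrying a customizable type alone unless a forced relabel is requested. A minimal sketch for inspecting this, assuming the standard targeted-policy paths (verify on the system at hand):

  # Customizable types for the targeted policy; container_file_t is usually listed here.
  cat /etc/selinux/targeted/contexts/customizable_types

  # Dry run (-n) of a forced (-F) recursive (-r) relabel with verbose output (-v);
  # -F also resets customizable types, which a plain restorecon skips.
  # Drop -n to actually apply the changes.
  restorecon -rvnF /var/lib/kubelet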
Dec 02 15:52:16 crc kubenswrapper[4933]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 15:52:16 crc kubenswrapper[4933]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.890442 4933 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896859 4933 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896909 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896919 4933 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896929 4933 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896937 4933 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896947 4933 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896956 4933 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896965 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896972 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896980 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896989 4933 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.896996 4933 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897004 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897024 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897033 4933 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897041 4933 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897049 4933 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897056 4933 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897070 4933 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897084 4933 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897096 4933 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897107 4933 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897120 4933 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897131 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897140 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897149 4933 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897158 4933 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897166 4933 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897177 4933 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897187 4933 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897196 4933 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897205 4933 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897214 4933 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897222 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897230 4933 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897238 4933 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897246 4933 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897259 4933 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897268 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897276 4933 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897284 4933 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897292 4933 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897300 4933 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897308 4933 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897316 4933 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897323 4933 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897332 4933 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897339 4933 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897347 4933 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897355 4933 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897363 4933 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897371 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897379 4933 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897386 4933 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897394 4933 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897401 4933 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897409 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897420 4933 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897431 4933 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897439 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897447 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897456 4933 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897464 4933 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897473 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897482 4933 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897489 4933 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897499 4933 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897509 4933 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897519 4933 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897527 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.897534 4933 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.897940 4933 flags.go:64] FLAG: --address="0.0.0.0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.897968 4933 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.897986 4933 flags.go:64] FLAG: --anonymous-auth="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.897998 4933 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898015 4933 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898024 4933 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898036 4933 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898047 4933 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898056 4933 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898068 4933 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898082 4933 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898107 4933 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898121 4933 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898132 4933 flags.go:64] FLAG: --cgroup-root=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898146 4933 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898155 4933 flags.go:64] FLAG: --client-ca-file=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898164 4933 flags.go:64] FLAG: --cloud-config=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898173 4933 flags.go:64] FLAG: --cloud-provider=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898182 4933 flags.go:64] FLAG: --cluster-dns="[]"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898193 4933 flags.go:64] FLAG: --cluster-domain=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898202 4933 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898211 4933 flags.go:64] FLAG: --config-dir=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898220 4933 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898230 4933 flags.go:64] FLAG: --container-log-max-files="5"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898242 4933 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898252 4933 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898260 4933 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898270 4933 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898279 4933 flags.go:64] FLAG: --contention-profiling="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898288 4933 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898297 4933 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898307 4933 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898316 4933 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898327 4933 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898336 4933 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898345 4933 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898354 4933 flags.go:64] FLAG: --enable-load-reader="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898363 4933 flags.go:64] FLAG: --enable-server="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898372 4933 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898385 4933 flags.go:64] FLAG: --event-burst="100"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898395 4933 flags.go:64] FLAG: --event-qps="50"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898404 4933 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898413 4933 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898422 4933 flags.go:64] FLAG: --eviction-hard=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898434 4933 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898443 4933 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898453 4933 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898464 4933 flags.go:64] FLAG: --eviction-soft=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898473 4933 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898483 4933 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898495 4933 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898505 4933 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898514 4933 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898523 4933 flags.go:64] FLAG: --fail-swap-on="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898532 4933 flags.go:64] FLAG: --feature-gates=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898543 4933 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898552 4933 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898561 4933 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898570 4933 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898580 4933 flags.go:64] FLAG: --healthz-port="10248"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898589 4933 flags.go:64] FLAG: --help="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898598 4933 flags.go:64] FLAG: --hostname-override=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898606 4933 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898616 4933 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898626 4933 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898635 4933 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898643 4933 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898652 4933 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898661 4933 flags.go:64] FLAG: --image-service-endpoint=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898670 4933 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898679 4933 flags.go:64] FLAG: --kube-api-burst="100"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898688 4933 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898697 4933 flags.go:64] FLAG: --kube-api-qps="50"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898706 4933 flags.go:64] FLAG: --kube-reserved=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898715 4933 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898724 4933 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898733 4933 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898742 4933 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898753 4933 flags.go:64] FLAG: --lock-file=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898762 4933 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898772 4933 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898781 4933 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898856 4933 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898868 4933 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898877 4933 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898887 4933 flags.go:64] FLAG: --logging-format="text"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898897 4933 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898906 4933 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898915 4933 flags.go:64] FLAG: --manifest-url=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898924 4933 flags.go:64] FLAG: --manifest-url-header=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898936 4933 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898945 4933 flags.go:64] FLAG: --max-open-files="1000000"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898956 4933 flags.go:64] FLAG: --max-pods="110"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898965 4933 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898974 4933 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898983 4933 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.898993 4933 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899002 4933 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899012 4933 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899021 4933 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899041 4933 flags.go:64] FLAG: --node-status-max-images="50"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899051 4933 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899060 4933 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899071 4933 flags.go:64] FLAG: --pod-cidr=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899082 4933 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899100 4933 flags.go:64] FLAG: --pod-manifest-path=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899111 4933 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899122 4933 flags.go:64] FLAG: --pods-per-core="0"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899132 4933 flags.go:64] FLAG: --port="10250"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899143 4933 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899155 4933 flags.go:64] FLAG: --provider-id=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899166 4933 flags.go:64] FLAG: --qos-reserved=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899178 4933 flags.go:64] FLAG: --read-only-port="10255"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899188 4933 flags.go:64] FLAG: --register-node="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899197 4933 flags.go:64] FLAG: --register-schedulable="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899207 4933 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899223 4933 flags.go:64] FLAG: --registry-burst="10"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899232 4933 flags.go:64] FLAG: --registry-qps="5"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899241 4933 flags.go:64] FLAG: --reserved-cpus=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899249 4933 flags.go:64] FLAG: --reserved-memory=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899261 4933 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899273 4933 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899284 4933 flags.go:64] FLAG: --rotate-certificates="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899293 4933 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899302 4933 flags.go:64] FLAG: --runonce="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899310 4933 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899320 4933 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899330 4933 flags.go:64] FLAG: --seccomp-default="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899340 4933 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899350 4933 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899359 4933 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899370 4933 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899381 4933 flags.go:64] FLAG: --storage-driver-password="root"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899390 4933 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899400 4933 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899409 4933 flags.go:64] FLAG: --storage-driver-user="root"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899418 4933 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899428 4933 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899437 4933 flags.go:64] FLAG: --system-cgroups=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899446 4933 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899459 4933 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899468 4933 flags.go:64] FLAG: --tls-cert-file=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899478 4933 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899489 4933 flags.go:64] FLAG: --tls-min-version=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899498 4933 flags.go:64] FLAG: --tls-private-key-file=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899506 4933 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899515 4933 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899525 4933 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899535 4933 flags.go:64] FLAG: --v="2"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899546 4933 flags.go:64] FLAG: --version="false"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899558 4933 flags.go:64] FLAG: --vmodule=""
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899568 4933 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.899578 4933 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899916 4933 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899954 4933 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899963 4933 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899971 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899982 4933 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899992 4933 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.899999 4933 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900006 4933 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900012 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900022 4933 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900028 4933 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900034 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900041 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900047 4933 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900053 4933 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900062 4933 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900074 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900081 4933 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900088 4933 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900094 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900102 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900108 4933 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900114 4933 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900121 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900127 4933 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900133 4933 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900139 4933 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900146 4933 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900153 4933 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900159 4933 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900166 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900172 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900187 4933 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900195 4933 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900202 4933 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900208 4933 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900215 4933 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900221 4933 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900227 4933 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900233 4933 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900238 4933 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900243 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900248 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900254 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900260 4933 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900267 4933 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900274 4933 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900280 4933 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900286 4933 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900293 4933 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900300 4933 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900306 4933 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900312 4933 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900318 4933 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900324 4933 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900330 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900337 4933 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900343 4933 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900349 4933 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900355 4933 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900361 4933 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900366 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900372 4933 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900378 4933 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900392 4933 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900399 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900408 4933 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900416 4933 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900424 4933 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900430 4933 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.900436 4933 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.900720 4933 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.915307 4933 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.915360 4933 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915495 4933 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915509 4933 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915517 4933 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915524 4933 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915530 4933 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915538 4933 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915545 4933 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915551 4933 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915557 4933 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915564 4933 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915570 4933 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915577 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915584 4933 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915593 4933 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915603 4933 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915609 4933 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915615 4933 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915620 4933 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915626 4933 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915632 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915638 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915645 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915651 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915658 4933 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915664 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915671 4933 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915681 4933 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915694 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915703 4933 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915710 4933 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915719 4933 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915727 4933 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915736 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915746 4933 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915755 4933 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915764 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915772 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915779 4933 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915786 4933 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915793 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915800 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915807 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915813 4933 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915842 4933 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915848 4933 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915853 4933 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915858 4933 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915863 4933 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915869 4933 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915874 4933 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915879 4933 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915885 4933 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915890 4933 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915897 4933 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915903 4933 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915908 4933 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915913 4933 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915918 4933 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915923 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915928 4933 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915935 4933 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915941 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915947 4933 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915952 4933 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915958 4933 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915963 4933 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915968 4933 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915974 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915979 4933 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915984 4933 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.915989 4933 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.915999 4933 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916170 4933 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916178 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916184 4933 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916190 4933 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916196 4933 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916201 4933 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916206 4933 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916211 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916216 4933 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916221 4933 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916228 4933 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916235 4933 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916241 4933 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916246 4933 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916253 4933 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916260 4933 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916266 4933 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916272 4933 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916278 4933 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916284 4933 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916290 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916296 4933 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916302 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916307 4933 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916312 4933 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916317 4933 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916322 4933 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916327 4933 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916332 4933 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916337 4933 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916342 4933 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916347 4933 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916352 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916359 4933 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916366 4933 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916371 4933 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916376 4933 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916385 4933 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916390 4933 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916395 4933 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916400 4933 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916405 4933 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916410 4933 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916417 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916423 4933 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916428 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916433 4933 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916438 4933 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916443 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916449 4933 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916454 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916459 4933 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916466 4933 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916473 4933 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916479 4933 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916485 4933 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916490 4933 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916495 4933 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916500 4933 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916506 4933 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916511 4933 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916516 4933 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916521 4933 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916526 4933 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916531 4933 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916536 4933 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916541 4933 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916546 4933 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916551 4933 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916556 4933 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.916561 4933 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.916569 4933 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.916784 4933 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.920103 4933 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.920215 4933 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.920794 4933 server.go:997] "Starting client certificate rotation" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.920846 4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.921261 4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-20 18:54:11.730298698 +0000 UTC Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.921391 4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.926711 4933 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.928417 4933 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.928640 4933 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.937637 4933 log.go:25] "Validated CRI v1 runtime API" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.954398 4933 log.go:25] "Validated CRI v1 image API" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.956002 4933 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.958742 4933 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-15-47-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.958780 4933 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.980429 4933 manager.go:217] Machine: {Timestamp:2025-12-02 15:52:16.978807975 +0000 UTC m=+0.230034708 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:84b7b789-bc9b-466b-8619-2bf2e1fdb8d0 BootID:b45811fa-f657-451d-9a34-cdd268fcc941 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 
Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:dd:a1:de Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:dd:a1:de Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7a:eb:fb Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:64:58:95 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a3:53:87 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b8:f2:90 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:c4:3e:d4:f1:e6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:fd:19:94:7b:45 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.981084 4933 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.981482 4933 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.982371 4933 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.982689 4933 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.982801 4933 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.983161 4933 topology_manager.go:138] "Creating topology manager with none policy" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.983236 4933 
container_manager_linux.go:303] "Creating device plugin manager" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.983488 4933 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.983587 4933 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.983870 4933 state_mem.go:36] "Initialized new in-memory state store" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.984038 4933 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.985185 4933 kubelet.go:418] "Attempting to sync node with API server" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.985304 4933 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.985470 4933 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.985590 4933 kubelet.go:324] "Adding apiserver pod source" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.985674 4933 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.987374 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.987436 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.987579 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.987564 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.987753 4933 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.988147 4933 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
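The nodeConfig dump above pins down this node's resource accounting: SystemReserved of 200m CPU, 350Mi memory, and 350Mi ephemeral-storage, KubeReserved null, and a hard eviction threshold of 100Mi for memory.available. Node allocatable memory is capacity minus the reservations minus the hard eviction threshold (the standard Kubernetes allocatable formula, not a figure taken from this log); with the 33654128640-byte MemoryCapacity reported in the Machine record, a back-of-the-envelope check looks like:

    package main

    import "fmt"

    func main() {
        const Mi = 1024 * 1024
        capacity := int64(33654128640) // MemoryCapacity from the Machine record
        systemReserved := int64(350 * Mi)
        kubeReserved := int64(0) // KubeReserved is null in this nodeConfig
        evictionHard := int64(100 * Mi) // memory.available hard threshold

        allocatable := capacity - systemReserved - kubeReserved - evictionHard
        fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
            allocatable, float64(allocatable)/(1024*1024*1024))
    }

That works out to roughly 33182269440 bytes (about 30.9 GiB) schedulable for pods, with everything else held back for system daemons and the eviction safety margin.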
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989000 4933 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989551 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989576 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989585 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989592 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989605 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989612 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989620 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989631 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989641 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989649 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989660 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989667 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.989866 4933 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.990429 4933 server.go:1280] "Started kubelet" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.990986 4933 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.990975 4933 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.991013 4933 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.992427 4933 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.992623 4933 server.go:460] "Adding debug handlers to kubelet server" Dec 02 15:52:16 crc systemd[1]: Started Kubernetes Kubelet. 
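Everything that needs the API server is failing with the same "dial tcp 38.102.83.213:6443: connect: connection refused", which is normal this early in a CRC boot: the kubelet starts first, and the kube-apiserver it is about to launch from the static-pod manifests is not listening yet, so the reflectors, the CSR post, the CSINode lookup, and the event writes below all just retry. A quick way to reproduce that exact failure mode from the node (a hypothetical standalone probe, not part of the kubelet):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Same endpoint the kubelet is dialing; expect "connection refused"
        // until the kube-apiserver static pod starts listening on 6443.
        conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 2*time.Second)
        if err != nil {
            fmt.Println("probe failed:", err)
            return
        }
        conn.Close()
        fmt.Println("api-int is accepting connections")
    }

Once the apiserver container comes up, these errors stop on their own; persistent refusals after startup would instead point at the apiserver pod or the haproxy/DNS path for api-int.crc.testing.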
Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.992731 4933 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.992756 4933 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.993435 4933 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 11:15:20.204005925 +0000 UTC Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.993543 4933 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1027h23m3.210468194s for next certificate rotation Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.994034 4933 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.994089 4933 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.994264 4933 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.994118 4933 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 02 15:52:16 crc kubenswrapper[4933]: W1202 15:52:16.995071 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.995192 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.995264 4933 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d70e024bbf4ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 15:52:16.990385323 +0000 UTC m=+0.241612036,LastTimestamp:2025-12-02 15:52:16.990385323 +0000 UTC m=+0.241612036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 15:52:16 crc kubenswrapper[4933]: E1202 15:52:16.997498 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms" Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.998517 4933 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 02 15:52:16 crc 
kubenswrapper[4933]: I1202 15:52:16.998549 4933 factory.go:55] Registering systemd factory Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.998560 4933 factory.go:221] Registration of the systemd container factory successfully Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.999677 4933 factory.go:153] Registering CRI-O factory Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.999699 4933 factory.go:221] Registration of the crio container factory successfully Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.999727 4933 factory.go:103] Registering Raw factory Dec 02 15:52:16 crc kubenswrapper[4933]: I1202 15:52:16.999746 4933 manager.go:1196] Started watching for new ooms in manager Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.001996 4933 manager.go:319] Starting recovery of all containers Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.010840 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.010926 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.010939 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.010951 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.010964 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011001 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011013 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011025 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011040 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011050 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011059 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011068 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011079 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011093 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011103 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011114 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011124 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011137 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011150 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011161 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011172 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011184 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011198 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011212 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011225 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011236 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011253 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011268 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011305 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011317 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011328 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011341 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011354 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011374 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011419 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011432 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011444 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011457 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011473 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011488 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011503 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011519 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011532 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011546 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011561 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011577 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011595 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011612 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011628 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011648 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011667 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011713 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011734 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011750 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011765 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011781 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011798 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011811 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011909 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011922 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011970 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011985 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.011999 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012013 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012026 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012039 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012528 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012546 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012560 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012572 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012587 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012603 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012616 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012629 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012644 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012657 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012672 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012692 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012705 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012718 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012737 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012754 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012769 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012782 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012796 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012809 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012841 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012856 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012869 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012883 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012895 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012908 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012921 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012933 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012947 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012959 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012972 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012983 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.012995 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013006 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013018 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013030 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013041 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013052 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013076 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013089 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013103 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013116 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013129 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013142 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013161 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013176 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013190 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013203 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013215 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013231 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013244 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013259 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013272 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013285 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013300 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013312 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013323 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013334 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013348 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013361 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.013375 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014048 4933 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014074 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014097 4933 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014126 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014139 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014153 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014165 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014175 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014187 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014199 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014212 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014223 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014233 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014243 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014252 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014261 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014270 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014280 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014291 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014301 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014310 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014319 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014327 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014336 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014347 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014357 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014368 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014378 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014389 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014398 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014412 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014423 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014434 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014445 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014458 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014474 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014488 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014500 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014513 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014524 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014538 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014550 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014562 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014576 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014596 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014646 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014659 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014669 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014683 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014705 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014731 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014746 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014761 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014776 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014790 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014804 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014840 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014852 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014865 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014878 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014892 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014905 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014922 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014933 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014943 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014957 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014970 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014983 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.014992 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015003 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015013 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015023 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015032 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015042 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015055 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015064 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015074 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015084 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015094 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015106 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015117 4933 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015131 4933 reconstruct.go:97] "Volume reconstruction finished" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.015140 4933 reconciler.go:26] "Reconciler: start to sync state" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.023412 4933 manager.go:324] Recovery completed Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.037347 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.042898 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.042951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.042961 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.044322 4933 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.044341 4933 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.044491 4933 state_mem.go:36] "Initialized new in-memory state store" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.049897 4933 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.051990 4933 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.052041 4933 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.052075 4933 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.052123 4933 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.094668 4933 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 15:52:17 crc kubenswrapper[4933]: W1202 15:52:17.095975 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.096077 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.103711 4933 policy_none.go:49] "None policy: Start" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.105359 4933 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.105393 4933 state_mem.go:35] "Initializing new in-memory state store" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.152253 4933 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.165408 4933 manager.go:334] "Starting Device Plugin manager" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.166184 4933 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.166221 4933 server.go:79] "Starting device plugin registration server" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.166898 4933 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.166920 4933 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.167273 4933 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.167449 4933 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.167464 4933 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.175537 4933 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.199126 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.267907 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.269333 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.269390 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.269401 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.269437 4933 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.270177 4933 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.352662 4933 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.352851 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.354212 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.354367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.354499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.354807 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.355292 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.355421 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.356145 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.356296 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.356395 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.356615 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.356837 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.356907 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.357310 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.357360 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.357380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358149 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358174 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358183 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358317 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358414 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358472 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358526 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358483 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.358811 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.359364 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.359393 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.359406 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.359502 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.359929 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.359963 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360166 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360284 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360653 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360675 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360676 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360683 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360729 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.360993 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.361031 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.362366 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.362400 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.362410 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422055 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422135 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422176 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422239 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422285 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422335 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422375 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422461 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422510 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422538 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422562 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422583 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422601 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422622 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.422639 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.470594 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.472291 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.472350 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.472394 4933 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.473089 4933 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.524810 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.524938 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.524993 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525026 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525047 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525074 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525096 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525119 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525119 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525141 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525189 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525226 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525231 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525244 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525271 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525199 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525313 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525316 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525354 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525428 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525438 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525458 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525552 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525644 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525672 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525701 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525705 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525761 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.525788 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.600642 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.690859 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.695287 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.711912 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.727141 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.734608 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 15:52:17 crc kubenswrapper[4933]: W1202 15:52:17.735940 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-77cfd7d6b09b733d118e35dd69232092d668eec555cc03b1d82a41e2cf52b960 WatchSource:0}: Error finding container 77cfd7d6b09b733d118e35dd69232092d668eec555cc03b1d82a41e2cf52b960: Status 404 returned error can't find the container with id 77cfd7d6b09b733d118e35dd69232092d668eec555cc03b1d82a41e2cf52b960 Dec 02 15:52:17 crc kubenswrapper[4933]: W1202 15:52:17.737312 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-66f6957dcd35e7d4c87dbff134d81dfee9aa0b5dc9f13a19ec973c2f418dd983 WatchSource:0}: Error finding container 66f6957dcd35e7d4c87dbff134d81dfee9aa0b5dc9f13a19ec973c2f418dd983: Status 404 returned error can't find the container with id 66f6957dcd35e7d4c87dbff134d81dfee9aa0b5dc9f13a19ec973c2f418dd983 Dec 02 15:52:17 crc kubenswrapper[4933]: W1202 15:52:17.742060 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-59a7ab04bbf751174118fea3c1a03d34a438843db3e4cdeee4f54de35ff6e12f WatchSource:0}: Error finding container 59a7ab04bbf751174118fea3c1a03d34a438843db3e4cdeee4f54de35ff6e12f: Status 404 returned error can't find the container with id 59a7ab04bbf751174118fea3c1a03d34a438843db3e4cdeee4f54de35ff6e12f Dec 02 15:52:17 crc kubenswrapper[4933]: W1202 15:52:17.750289 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-5589e7ee1ddeec02af0bf6ed375ea7e47ec10080642f975bf881252bbe80a6f1 WatchSource:0}: Error finding container 5589e7ee1ddeec02af0bf6ed375ea7e47ec10080642f975bf881252bbe80a6f1: Status 404 returned error can't find the container with id 5589e7ee1ddeec02af0bf6ed375ea7e47ec10080642f975bf881252bbe80a6f1 Dec 02 15:52:17 crc kubenswrapper[4933]: W1202 15:52:17.751319 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-96b1d62472a86d6027c07d04c509b3e567aa4fd7cd614191144224a74c2152ef WatchSource:0}: Error finding container 96b1d62472a86d6027c07d04c509b3e567aa4fd7cd614191144224a74c2152ef: Status 404 returned error can't find the container with id 96b1d62472a86d6027c07d04c509b3e567aa4fd7cd614191144224a74c2152ef Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.873856 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.876276 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.876316 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.876352 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.876380 4933 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Dec 02 15:52:17 crc kubenswrapper[4933]: E1202 15:52:17.877043 4933 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Dec 02 15:52:17 crc kubenswrapper[4933]: I1202 15:52:17.992331 4933 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.058224 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"59a7ab04bbf751174118fea3c1a03d34a438843db3e4cdeee4f54de35ff6e12f"} Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.059650 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"66f6957dcd35e7d4c87dbff134d81dfee9aa0b5dc9f13a19ec973c2f418dd983"} Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.060403 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"77cfd7d6b09b733d118e35dd69232092d668eec555cc03b1d82a41e2cf52b960"} Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.061685 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"96b1d62472a86d6027c07d04c509b3e567aa4fd7cd614191144224a74c2152ef"} Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.062571 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5589e7ee1ddeec02af0bf6ed375ea7e47ec10080642f975bf881252bbe80a6f1"} Dec 02 15:52:18 crc kubenswrapper[4933]: W1202 15:52:18.118656 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:18 crc kubenswrapper[4933]: E1202 15:52:18.118781 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:18 crc kubenswrapper[4933]: W1202 15:52:18.374289 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:18 crc kubenswrapper[4933]: E1202 15:52:18.374394 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:18 crc kubenswrapper[4933]: W1202 15:52:18.397836 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:18 crc kubenswrapper[4933]: E1202 15:52:18.397951 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:18 crc kubenswrapper[4933]: E1202 15:52:18.401845 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s" Dec 02 15:52:18 crc kubenswrapper[4933]: W1202 15:52:18.610695 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:18 crc kubenswrapper[4933]: E1202 15:52:18.610781 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.677997 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.680031 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.680119 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.680137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.680179 4933 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 15:52:18 crc kubenswrapper[4933]: E1202 15:52:18.680879 4933 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Dec 02 15:52:18 crc kubenswrapper[4933]: I1202 15:52:18.992743 4933 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.068857 4933 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2" exitCode=0 Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.068935 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2"} Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.069047 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.070750 4933 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc" exitCode=0 Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.071297 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.071286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.071648 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.071675 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.071781 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc"} Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.072478 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.072518 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.072532 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.074664 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5"} Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.074747 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d"} Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.092049 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5" exitCode=0 Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.092196 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5"} Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.092383 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.094182 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.094219 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.094232 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.097044 4933 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154" exitCode=0 Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.097135 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.097132 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154"} Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.097831 4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.098608 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.098706 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.098764 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:19 crc kubenswrapper[4933]: E1202 15:52:19.098912 4933 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.120251 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.121463 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.121546 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.121604 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:19 crc kubenswrapper[4933]: E1202 15:52:19.871264 4933 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d70e024bbf4ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 15:52:16.990385323 +0000 UTC m=+0.241612036,LastTimestamp:2025-12-02 15:52:16.990385323 +0000 UTC m=+0.241612036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 15:52:19 crc kubenswrapper[4933]: I1202 15:52:19.992532 4933 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:20 crc kubenswrapper[4933]: E1202 15:52:20.003569 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s" Dec 02 15:52:20 crc kubenswrapper[4933]: W1202 15:52:20.078560 4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Dec 02 15:52:20 crc kubenswrapper[4933]: E1202 15:52:20.078692 4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.114252 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.114415 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.115536 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.115587 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.115607 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.121986 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.122065 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.122154 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.122152 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.123678 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.123714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.123723 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.124900 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.125395 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.125428 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.125770 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.125788 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.125796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.129370 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.129432 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.129451 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f"} Dec 02 15:52:20 crc kubenswrapper[4933]: 
I1202 15:52:20.129466 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.132547 4933 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8" exitCode=0 Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.132654 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8"} Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.132860 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.133958 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.134015 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.134029 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.281817 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.283414 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.283451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.283466 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:20 crc kubenswrapper[4933]: I1202 15:52:20.283500 4933 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.138588 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19"} Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.139498 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.140717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.140874 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.140975 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.140892 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.140812 4933 
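The PLEG entries embed each event as a small JSON object after `event=`. A sketch that unmarshals one such payload, copied verbatim from this log, into a hypothetical struct of my own; the field meanings are inferred from context, and kubelet's actual event type is internal:

```go
// Illustrative only: decode the JSON shape printed after "event=" above.
package main

import (
	"encoding/json"
	"fmt"
)

// podLifecycleEvent mirrors the logged JSON; it is not kubelet's own type.
type podLifecycleEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	raw := `{"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8"}`
	var ev podLifecycleEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
```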
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.141225    4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.141258    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.140842    4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb"}
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.141367    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.141554    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142258    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142274    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142282    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142355    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142374    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142384    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142391    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142444    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142468    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142962    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142989    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.142999    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:21 crc kubenswrapper[4933]: I1202 15:52:21.724760    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148201    4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c86a4b6d42de44d954b5bcf8b42ac10cffaac09c9f7e362a2d26b4a0fcba9a74"}
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148275    4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5e5b219049292adb37755387b666dabb044ec252e589f4f554a96bb2e858612c"}
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148292    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148303    4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d4f02704d1f6ee3644bceab7f4cd44ef43e948d1777fa2e225642a5da98901f"}
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148324    4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"292a536fa8488df1cde96062a3d40cbe5ecf556b30ae0ba609d9deb6b8dde7d1"}
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148337    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148418    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148344    4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d1258d15649d53d011ca2d8bb0e3e6b17ddaabe8ee2de9d184d85141ef2229ff"}
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.148523    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.149508    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.149551    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.149568    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.149592    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.149613    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.149622    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.150399    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.150461    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.150477    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.185162    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.196886    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.499919    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.500158    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.501703    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.501782    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.501806    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:22 crc kubenswrapper[4933]: I1202 15:52:22.728901    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.109399    4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.150534    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.150615    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.150683    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.150635    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151863    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151910    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151912    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151924    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151937    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151955    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151947    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.151988    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.152000    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.518142    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 02 15:52:23 crc kubenswrapper[4933]: I1202 15:52:23.749164    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.152764    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.152856    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.152967    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154034    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154066    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154076    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154358    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154407    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154425    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154540    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154584    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.154599    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.725357    4933 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 15:52:24 crc kubenswrapper[4933]: I1202 15:52:24.725449    4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.156121    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.157892    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.157950    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.157964    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.322545    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.322810    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.324900    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.324961    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:25 crc kubenswrapper[4933]: I1202 15:52:25.324991    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:27 crc kubenswrapper[4933]: E1202 15:52:27.178150    4933 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 02 15:52:28 crc kubenswrapper[4933]: I1202 15:52:28.566203    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:28 crc kubenswrapper[4933]: I1202 15:52:28.566374    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:28 crc kubenswrapper[4933]: I1202 15:52:28.567737    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:28 crc kubenswrapper[4933]: I1202 15:52:28.567796    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:28 crc kubenswrapper[4933]: I1202 15:52:28.567810    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:28 crc kubenswrapper[4933]: I1202 15:52:28.886383    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 15:52:29 crc kubenswrapper[4933]: I1202 15:52:29.168818    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:29 crc kubenswrapper[4933]: I1202 15:52:29.170270    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:29 crc kubenswrapper[4933]: I1202 15:52:29.170335    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:29 crc kubenswrapper[4933]: I1202 15:52:29.170353    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:30 crc kubenswrapper[4933]: E1202 15:52:30.284680    4933 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Dec 02 15:52:30 crc kubenswrapper[4933]: W1202 15:52:30.400221    4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 02 15:52:30 crc kubenswrapper[4933]: I1202 15:52:30.400386    4933 trace.go:236] Trace[84228605]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 15:52:20.398) (total time: 10002ms):
Dec 02 15:52:30 crc kubenswrapper[4933]: Trace[84228605]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:52:30.400)
Dec 02 15:52:30 crc kubenswrapper[4933]: Trace[84228605]: [10.002082969s] [10.002082969s] END
Dec 02 15:52:30 crc kubenswrapper[4933]: E1202 15:52:30.400423    4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 02 15:52:30 crc kubenswrapper[4933]: W1202 15:52:30.862750    4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 02 15:52:30 crc kubenswrapper[4933]: I1202 15:52:30.862897    4933 trace.go:236] Trace[966343377]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 15:52:20.860) (total time: 10002ms):
Dec 02 15:52:30 crc kubenswrapper[4933]: Trace[966343377]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (15:52:30.862)
Dec 02 15:52:30 crc kubenswrapper[4933]: Trace[966343377]: [10.002159182s] [10.002159182s] END
Dec 02 15:52:30 crc kubenswrapper[4933]: E1202 15:52:30.862925    4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 02 15:52:30 crc kubenswrapper[4933]: W1202 15:52:30.924394    4933 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 02 15:52:30 crc kubenswrapper[4933]: I1202 15:52:30.924572    4933 trace.go:236] Trace[572703570]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 15:52:20.922) (total time: 10002ms):
Dec 02 15:52:30 crc kubenswrapper[4933]: Trace[572703570]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:52:30.924)
Dec 02 15:52:30 crc kubenswrapper[4933]: Trace[572703570]: [10.002134781s] [10.002134781s] END
Dec 02 15:52:30 crc kubenswrapper[4933]: E1202 15:52:30.924615    4933 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 02 15:52:30 crc kubenswrapper[4933]: I1202 15:52:30.993186    4933 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 02 15:52:31 crc kubenswrapper[4933]: I1202 15:52:31.163504    4933 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 02 15:52:31 crc kubenswrapper[4933]: I1202 15:52:31.163576    4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 02 15:52:31 crc kubenswrapper[4933]: I1202 15:52:31.175385    4933 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 02 15:52:31 crc kubenswrapper[4933]: I1202 15:52:31.175449    4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.485717    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.488322    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.488401    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.488420    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.488466    4933 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 15:52:33 crc kubenswrapper[4933]: E1202 15:52:33.502988    4933 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.546039    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.546335    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
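The kube-apiserver startup probe fails here with a 403 whose start-of-body is a Kubernetes Status object: this early in bring-up the unauthenticated probe identity system:anonymous is not yet permitted to read /livez. A sketch that decodes that exact body, copied verbatim from the log, with a minimal hand-rolled struct; the canonical type is metav1.Status in k8s.io/apimachinery:

```go
// Illustrative only: parse the Status body the failed /livez probe returned.
package main

import (
	"encoding/json"
	"fmt"
)

type status struct {
	Kind    string `json:"kind"`
	Status  string `json:"status"`
	Message string `json:"message"`
	Reason  string `json:"reason"`
	Code    int    `json:"code"`
}

func main() {
	body := `{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}`
	var s status
	if err := json.Unmarshal([]byte(body), &s); err != nil {
		panic(err)
	}
	// Prints: 403 Forbidden: forbidden: User "system:anonymous" cannot get path "/livez"
	fmt.Printf("%d %s: %s\n", s.Code, s.Reason, s.Message)
}
```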
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.547880    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.547990    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.548020    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:33 crc kubenswrapper[4933]: I1202 15:52:33.567351    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.183081    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.184447    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.184502    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.184512    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.725357    4933 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.725449    4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 02 15:52:34 crc kubenswrapper[4933]: I1202 15:52:34.755445    4933 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.049398    4933 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.331468    4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.331659    4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.333249    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.333313    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.333330    4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.339508    4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.481770    4933 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.995468    4933 apiserver.go:52] "Watching apiserver"
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.999311    4933 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 02 15:52:35 crc kubenswrapper[4933]: I1202 15:52:35.999632    4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.000223    4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.000355    4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.000372    4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.000450    4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.000650    4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.000757    4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.001003    4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.001347    4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.001044 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.003997 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004019 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004140 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004357 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004357 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004446 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004645 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.004989 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.006266 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.035872 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.048375 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.061423 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.077207 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.094894 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.095616 4933 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.106958 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.121166 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.131562 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.148660 4933 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.151286 4933 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.158416 4933 trace.go:236] Trace[1773110577]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 15:52:25.468) (total time: 10689ms):
Dec 02 15:52:36 crc kubenswrapper[4933]: Trace[1773110577]: ---"Objects listed" error: 10689ms (15:52:36.158)
Dec 02 15:52:36 crc kubenswrapper[4933]: Trace[1773110577]: [10.689597934s] [10.689597934s] END
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.158462 4933 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.191005 4933 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58154->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.191074 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58154->192.168.126.11:17697: read: connection reset by peer"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.191005 4933 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58150->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.191201 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58150->192.168.126.11:17697: read: connection reset by peer"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.191461 4933 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.191497 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.203456 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251760 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251812 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251894 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251921 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251940 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251959 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.251983 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252029 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252054 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252079 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252119 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252136 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252156 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252175 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252191 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252207 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252227 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252247 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252284 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252301 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252320 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252337 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252356 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252468 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252486 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252509 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252531 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252497 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252548 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252590 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252649 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252689 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252844 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.252986 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253317 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253456 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253539 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253614 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253696 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253781 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253907 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254023 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254099 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254166 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254562 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254670 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254759 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254868 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254961 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.255057 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.255163 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.255259 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.257517 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258065 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258112 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258156 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258188 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258230 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258274 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258305 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258339 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258375 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258405 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258441 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258478 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258517 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258547 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258644 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258684 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258714 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258753 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258787 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259015 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259055 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259479 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259535 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259566 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259602 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259654 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259689 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259725 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259761 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259792 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259855 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259890 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259931 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259960 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259999 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260046 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260080 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260126 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260165 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260202 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260243 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260285 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260319 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260347 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260381 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260404 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260425 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260449 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260475 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260494 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260523 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260550 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261038 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261089 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261127 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261163 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253691 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261203 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261238 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253973 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.253969 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261279 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261316 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261345 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261387 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261427 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.261459 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262440 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262521 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262552 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262578 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262604 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262628 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262650 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262674 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262695 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262716 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262740 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262761 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 15:52:36 crc 
kubenswrapper[4933]: I1202 15:52:36.262780 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262801 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262838 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262864 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262888 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262912 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262930 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262955 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.262978 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263001 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263026 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263249 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263334 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263366 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263397 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263430 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263457 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263478 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263502 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263528 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263564 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263587 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263607 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263634 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263658 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263677 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263700 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.263746 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.264673 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254220 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" 
(OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254456 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254568 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254588 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254899 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254904 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276532 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.254923 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.255194 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.255246 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.255246 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.256351 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.256597 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.256655 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.256685 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276632 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.257107 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.257390 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276741 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.257625 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.257656 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258110 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258248 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276945 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.257577 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259326 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259397 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259632 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259606 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.259783 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260156 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.260060 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.264281 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.264625 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.264911 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.264947 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.265048 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.265246 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:52:36.765139297 +0000 UTC m=+20.016366040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277188 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277230 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277243 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277301 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277331 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277362 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277394 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277422 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277456 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277489 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277520 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277548 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277568 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277587 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277800 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277603 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277852 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277837 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277878 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.277889 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.265651 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.265764 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.265874 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.265976 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.266040 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.266257 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.266381 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278247 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278274 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278518 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278590 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278636 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278682 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278728 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278762 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278773 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.265506 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.266672 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.267039 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.267062 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.267637 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279120 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279179 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279207 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.267503 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.268317 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.268581 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.268578 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.269436 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.269467 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.269812 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.269914 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.270395 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.269127 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.268200 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.271281 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.271338 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.271684 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.271890 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.271952 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.270767 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.272198 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.272228 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.273057 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.273324 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.273332 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.273466 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.273660 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.274883 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.275249 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.275267 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.275487 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.275483 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276028 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278253 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276201 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276273 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276461 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276533 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258608 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.276965 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278849 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279451 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279530 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279761 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279896 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.279933 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.280205 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.280529 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.280613 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.280643 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.280753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.280857 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.281119 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.281455 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.267735 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.281550 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.281566 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.258726 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.282420 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.282433 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.282577 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.282805 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.266667 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283438 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283270 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283306 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.278814 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283522 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283548 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283572 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283595 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283618 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283639 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283659 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.283682 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284335 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284365 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284390 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284451 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284518 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284565 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284588 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284608 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284630 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284653 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284657 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284674 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284727 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.284778 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285107 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285258 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285297 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285320 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285345 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285371 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285439 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285468 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285528 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285636 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285813 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.285878 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.286410 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.286383 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.286963 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287099 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287174 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287208 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287241 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287281 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287336 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287032 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287871 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.287911 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.288136 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288138 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288185 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288129 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288243 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288253 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288463 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288554 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288554 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288620 4933 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.288711 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288652 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.288921 4933 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289022 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289085 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.289127 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:36.789094593 +0000 UTC m=+20.040321306 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289216 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289280 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289350 4933 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289317 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.289549 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:36.789522794 +0000 UTC m=+20.040749497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.289980 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290018 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290054 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290084 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290111 4933 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290175 4933 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290198 4933 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290260 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290283 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290346 4933 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290432 4933 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290457 4933 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290512 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290535 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290593 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290617 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290638 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290728 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290752 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290587 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.290659 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291970 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291996 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291537 4933 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291199 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291308 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291431 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291435 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.292039 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.292435 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.291069 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.292574 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.292753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.292859 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.292976 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293017 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293236 4933 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293303 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293725 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293850 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294059 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293475 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.293634 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294132 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294296 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294320 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294334 4933 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294350 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294362 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294374 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294424 4933 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294436 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294446 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294458 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc 
kubenswrapper[4933]: I1202 15:52:36.294468 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294478 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294489 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294500 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294513 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294523 4933 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294578 4933 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294590 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294602 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294614 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294626 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294636 4933 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294647 4933 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 
15:52:36.294659 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294671 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294682 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294693 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294703 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294714 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294768 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294778 4933 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294789 4933 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294799 4933 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294847 4933 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294876 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294906 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294918 4933 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294928 4933 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294939 4933 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294966 4933 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.294976 4933 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295006 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295016 4933 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295028 4933 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295038 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295065 4933 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295075 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295085 4933 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295095 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295105 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295114 4933 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295125 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295138 4933 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295150 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295162 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295174 4933 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295185 4933 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295196 4933 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295207 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295219 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295234 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295244 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295254 4933 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295264 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295274 4933 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295285 4933 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295298 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295318 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295326 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295337 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295350 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295362 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295375 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295389 4933 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295401 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295416 4933 reconciler_common.go:293] "Volume detached for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295430 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295444 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295456 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295469 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295479 4933 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295488 4933 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295497 4933 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295508 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295584 4933 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295616 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295626 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295636 4933 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295647 4933 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295657 4933 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295666 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295676 4933 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295686 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295696 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295707 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295717 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295727 4933 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295739 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295748 4933 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295760 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295771 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295780 4933 reconciler_common.go:293] 
"Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295789 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.295798 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.296050 4933 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.300960 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.301353 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.303174 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304229 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304254 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304271 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304347 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-02 15:52:36.804321469 +0000 UTC m=+20.055548172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304800 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304816 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304858 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.304918 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:36.804906304 +0000 UTC m=+20.056132997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.307402 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.307455 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.309635 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.309690 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.309634 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.310225 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.310256 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.310529 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.310617 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.310798 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.311001 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.311708 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.311917 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.312254 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.312689 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.312880 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.313105 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.313601 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.313558 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.313451 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.314285 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.314333 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.315137 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.315191 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.315356 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.315350 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.319174 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.326481 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.339214 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.348513 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.354136 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.397068 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.397288 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.397435 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.397512 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.397666 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398074 4933 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398093 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398109 4933 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398123 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398137 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398149 4933 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398162 
4933 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398208 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398220 4933 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398234 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398247 4933 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398259 4933 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398272 4933 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398283 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398298 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398312 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398325 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398338 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398351 4933 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" 
Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398365 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398409 4933 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398422 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398434 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398494 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398507 4933 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398521 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398535 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398547 4933 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398562 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398577 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398589 4933 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398601 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398611 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398623 4933 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398634 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398646 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398658 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398670 4933 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398682 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398693 4933 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398705 4933 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398716 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398728 4933 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398739 4933 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398750 4933 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398761 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398773 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398784 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398796 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398809 4933 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398844 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398856 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398868 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.398879 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.617937 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.631592 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.802249 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.802372 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:52:37.80235205 +0000 UTC m=+21.053578763 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.802437 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.802468 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.802580 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.802614 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.802626 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:37.802619557 +0000 UTC m=+21.053846260 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.802738 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:37.802682718 +0000 UTC m=+21.053909461 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903297 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903354 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903381 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903512 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:37.903480391 +0000 UTC m=+21.154707124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.903087 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:36 crc kubenswrapper[4933]: I1202 15:52:36.903728 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903853 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903901 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903918 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:36 crc kubenswrapper[4933]: E1202 15:52:36.903954 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:37.903943672 +0000 UTC m=+21.155170375 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.057299 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.058114 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.058780 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.059383 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.059983 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.060501 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.062102 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.062614 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.063574 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.064091 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.064964 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.065628 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.066136 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.067015 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.067497 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.068334 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.068964 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.069379 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.073879 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.074665 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.075483 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.076015 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.076532 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.077692 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.078137 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.079112 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.079691 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.080618 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.081209 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.082049 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.082510 4933 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.082600 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.084283 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.085128 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.085556 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.086963 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.087940 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.088468 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.089452 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.090109 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.090573 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.090973 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.091544 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.092649 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.094905 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.095497 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 15:52:37 crc 
kubenswrapper[4933]: I1202 15:52:37.096632 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.097151 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.098376 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.098957 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.100106 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.100601 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.101599 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.102182 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.102664 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.108998 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.122885 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.140476 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.165507 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.185210 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.195566 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.195646 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1ea4c41a070502693100c2b085d1f6269783366ab2d0157f36fe724835e5b949"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.198739 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.198785 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.198803 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f24205085d6203cfe6d680f79b48c68df3f8e9f232e66a82598fdd012f6fbeeb"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.200787 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.200987 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.203005 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19" exitCode=255 Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.203040 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.203723 4933 scope.go:117] "RemoveContainer" containerID="bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.204043 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b26d78b09797004ef559ee46a55ec063c7ba4ad62eccde98cb6a4dd141d61d26"} Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.218066 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.238392 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.253879 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.267375 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.285342 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.299379 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.311742 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.326284 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.343102 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.359784 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.381087 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02
T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.403546 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.418429 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.435107 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.816563 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.816837 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.816926 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:39.816897209 +0000 UTC m=+23.068123912 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.817120 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.817249 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.817389 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.817395 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:52:39.81735022 +0000 UTC m=+23.068576943 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.817614 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:39.817602577 +0000 UTC m=+23.068829280 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.918633 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:37 crc kubenswrapper[4933]: I1202 15:52:37.918987 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.918932 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.919214 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.919324 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.919098 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.919436 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.919454 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.919540 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:39.919516297 +0000 UTC m=+23.170743000 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:37 crc kubenswrapper[4933]: E1202 15:52:37.926731 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:39.926682759 +0000 UTC m=+23.177909462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.052347 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.052398 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.052426 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:38 crc kubenswrapper[4933]: E1202 15:52:38.052505 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:38 crc kubenswrapper[4933]: E1202 15:52:38.052931 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:38 crc kubenswrapper[4933]: E1202 15:52:38.053172 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.209816 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.212264 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588"} Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.212754 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.232208 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.253182 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.267114 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.290404 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.321351 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.343813 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:38 crc kubenswrapper[4933]: I1202 15:52:38.363941 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:38Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.837242 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.837366 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:39 crc 
kubenswrapper[4933]: I1202 15:52:39.837387 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.837516 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.837586 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:52:43.837516584 +0000 UTC m=+27.088743297 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.837672 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:43.837636577 +0000 UTC m=+27.088863520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.837597 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.837763 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:43.83775297 +0000 UTC m=+27.088979673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.903985 4933 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.905790 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.905863 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.905877 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.905976 4933 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.925251 4933 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.925609 4933 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.926944 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.926988 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.927001 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.927020 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.927034 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:39Z","lastTransitionTime":"2025-12-02T15:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.938633 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.938669 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.938793 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.938811 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.938839 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.938880 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:43.938865381 +0000 UTC m=+27.190092084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.939100 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.939186 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.939232 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.939344 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:43.939311882 +0000 UTC m=+27.190538745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.966707 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-02T15:52:39Z is after 2025-08-24T17:21:41Z"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.970891 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.970940 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.970953 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.970969 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.970979 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:39Z","lastTransitionTime":"2025-12-02T15:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:52:39 crc kubenswrapper[4933]: E1202 15:52:39.985621 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; identical to the earlier full copy apart from the condition timestamps, here 2025-12-02T15:52:39Z] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:39Z is after 2025-08-24T17:21:41Z"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.994754 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.994838 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
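[editor's note, not part of the journal: every status-patch attempt above fails the same way, because the kubelet cannot call the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743; its serving certificate expired on 2025-08-24T17:21:41Z. A minimal Go sketch to read the certificate's validity window from the node and confirm the x509 error; only the endpoint is taken from the log, everything else is illustrative:]

package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// InsecureSkipVerify lets the handshake complete even though chain
	// verification fails (here: an expired certificate), so the validity
	// window can be read and compared with the x509 error in the log.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s\n", cert.Subject, cert.NotBefore, cert.NotAfter)
	}
}

[/editor's note]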
event="NodeHasNoDiskPressure" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.994855 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.994877 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:39 crc kubenswrapper[4933]: I1202 15:52:39.994891 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:39Z","lastTransitionTime":"2025-12-02T15:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.020794 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.026901 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.026932 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.026942 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.026958 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.026969 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.052558 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.052738 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.053152 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.053204 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.053242 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.053284 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.055369 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.059445 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.059502 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
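[editor's note, not part of the journal: alongside the webhook failures, the node stays NotReady because the kubelet reports "no CNI configuration file in /etc/kubernetes/cni/net.d/". A minimal Go sketch, assuming only the path named in the log, to list that directory and confirm whether the network plugin has written any configuration:]

package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	// List the directory the kubelet checks for CNI network configuration.
	entries, err := os.ReadDir("/etc/kubernetes/cni/net.d/")
	if err != nil {
		log.Fatal(err) // a missing or unreadable directory reproduces the same symptom
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files present")
		return
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}

[/editor's note]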
event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.059516 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.059535 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.059546 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.071925 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: E1202 15:52:40.072038 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.073703 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.073735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.073746 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.073760 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.073773 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.176371 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.176416 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.176431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.176453 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.176466 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.220069 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.238518 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.252217 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.270387 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.279191 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.279273 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.279299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.279340 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.279364 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.290645 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.310445 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.330568 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.349043 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:40Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.382936 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.383002 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.383020 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.383042 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.383060 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.486523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.486596 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.486620 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.486655 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.486679 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.589935 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.589985 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.589998 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.590021 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.590035 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.692690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.692749 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.692767 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.692796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.692817 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.796628 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.796701 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.796724 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.796757 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.796781 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.900532 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.900627 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.900653 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.900688 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:40 crc kubenswrapper[4933]: I1202 15:52:40.900711 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:40Z","lastTransitionTime":"2025-12-02T15:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.003742 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.003868 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.003897 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.003931 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.003959 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.107051 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.107124 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.107153 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.107184 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.107209 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.209633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.209701 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.209713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.209737 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.209753 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.313112 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.313193 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.313233 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.313256 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.313269 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.416255 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.416334 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.416357 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.416383 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.416411 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.518989 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.519046 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.519058 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.519076 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.519090 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.621651 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.621696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.621705 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.621720 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.621731 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.725654 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.725741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.725768 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.725803 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.725876 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.733384 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.740521 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.747876 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.760023 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.775957 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.794465 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.816921 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.829509 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.829576 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.829599 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.829625 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.829646 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.832123 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.851937 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.866446 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.883497 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.898075 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.919258 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.933290 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.933372 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.933398 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.933436 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.933464 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:41Z","lastTransitionTime":"2025-12-02T15:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.941969 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.961702 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:41 crc kubenswrapper[4933]: I1202 15:52:41.983459 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:41Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.009623 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.033295 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.036291 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.036334 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.036348 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.036367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.036380 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.052999 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.053038 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.053060 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:42 crc kubenswrapper[4933]: E1202 15:52:42.053180 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:42 crc kubenswrapper[4933]: E1202 15:52:42.053336 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:42 crc kubenswrapper[4933]: E1202 15:52:42.053576 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.138777 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.138870 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.138882 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.138902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.138913 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.241113 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.241155 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.241168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.241185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.241197 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.335179 4933 csr.go:261] certificate signing request csr-hgkqv is approved, waiting to be issued Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.343300 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.343348 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.343360 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.343380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.343393 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.355339 4933 csr.go:257] certificate signing request csr-hgkqv is issued Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.445633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.445662 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.445672 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.445686 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.445696 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.548047 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.548113 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.548124 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.548154 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.548165 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.650746 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.650809 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.650837 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.650856 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.650868 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.753486 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.753554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.753565 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.753578 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.753587 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.856372 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.856420 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.856433 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.856451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.856466 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.959326 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.959366 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.959380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.959398 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:42 crc kubenswrapper[4933]: I1202 15:52:42.959410 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:42Z","lastTransitionTime":"2025-12-02T15:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.062043 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.062087 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.062098 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.062114 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.062126 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.166477 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.166533 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.166545 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.166563 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.166581 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.222232 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-d2p6w"] Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.222839 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.225352 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5d9dn"] Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.225598 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.227948 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.228543 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.230865 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.231303 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.231411 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.231655 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.231690 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.233259 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.256613 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.269611 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.269663 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.269674 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.269694 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.269707 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.273145 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48jm\" (UniqueName: \"kubernetes.io/projected/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-kube-api-access-t48jm\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.273213 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-rootfs\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.273255 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97edaf10-912b-42e7-a9e7-930381d48508-hosts-file\") pod \"node-resolver-5d9dn\" (UID: \"97edaf10-912b-42e7-a9e7-930381d48508\") " pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.273286 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s82hp\" (UniqueName: \"kubernetes.io/projected/97edaf10-912b-42e7-a9e7-930381d48508-kube-api-access-s82hp\") pod \"node-resolver-5d9dn\" (UID: \"97edaf10-912b-42e7-a9e7-930381d48508\") " pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.273313 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-proxy-tls\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.273338 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-mcd-auth-proxy-config\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.282255 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.308456 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.327013 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.343934 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.356754 4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-02 15:47:42 +0000 UTC, rotation deadline is 2026-09-01 22:14:22.755030143 +0000 UTC Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.356835 4933 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6558h21m39.398213285s for next certificate rotation Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.358675 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.372634 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.372682 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.372694 4933 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.372714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.372729 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374117 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97edaf10-912b-42e7-a9e7-930381d48508-hosts-file\") pod \"node-resolver-5d9dn\" (UID: \"97edaf10-912b-42e7-a9e7-930381d48508\") " pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374168 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s82hp\" (UniqueName: \"kubernetes.io/projected/97edaf10-912b-42e7-a9e7-930381d48508-kube-api-access-s82hp\") pod \"node-resolver-5d9dn\" (UID: \"97edaf10-912b-42e7-a9e7-930381d48508\") " pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374201 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-proxy-tls\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374230 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-mcd-auth-proxy-config\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374261 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97edaf10-912b-42e7-a9e7-930381d48508-hosts-file\") pod \"node-resolver-5d9dn\" (UID: \"97edaf10-912b-42e7-a9e7-930381d48508\") " pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374277 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t48jm\" (UniqueName: \"kubernetes.io/projected/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-kube-api-access-t48jm\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374313 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-rootfs\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.374370 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-rootfs\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.375222 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-mcd-auth-proxy-config\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.381501 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-proxy-tls\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.382322 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.403903 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48jm\" (UniqueName: \"kubernetes.io/projected/e6c1c5e6-50dd-428a-890c-2c3f0456f2fa-kube-api-access-t48jm\") pod \"machine-config-daemon-d2p6w\" (UID: \"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\") " pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.406533 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s82hp\" (UniqueName: \"kubernetes.io/projected/97edaf10-912b-42e7-a9e7-930381d48508-kube-api-access-s82hp\") pod \"node-resolver-5d9dn\" (UID: \"97edaf10-912b-42e7-a9e7-930381d48508\") " pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.418587 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.437490 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.457024 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.475907 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.475944 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.475953 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.475967 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.475979 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.480111 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.513289 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.531025 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.535833 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.541860 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5d9dn" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.550956 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: W1202 15:52:43.559648 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97edaf10_912b_42e7_a9e7_930381d48508.slice/crio-c303bd18519e3e15b69d5b56b42e5ec98c6ea22f0e4a1602bcd6311560a00e36 WatchSource:0}: Error finding container c303bd18519e3e15b69d5b56b42e5ec98c6ea22f0e4a1602bcd6311560a00e36: Status 404 returned error can't find the container with id c303bd18519e3e15b69d5b56b42e5ec98c6ea22f0e4a1602bcd6311560a00e36 Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.563611 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.577609 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.579912 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.579960 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.580164 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.580186 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.580198 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.591389 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.605336 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.622758 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.623278 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mklc"] Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.629210 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-s779q"] Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.630375 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.633351 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.633492 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.634168 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.634191 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.635050 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.641322 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.642109 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-z6kjz"] Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.642258 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.642503 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.641657 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.643857 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.643898 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.645850 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.646157 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.646341 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.646457 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.646762 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.647203 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.662095 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676706 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-env-overrides\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676758 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-cni-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676786 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-ovn-kubernetes\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676816 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b033c545-93a2-4401-842b-22456e44216b-multus-daemon-config\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676865 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676897 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-system-cni-dir\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676934 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-conf-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-bin\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.676977 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-script-lib\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677002 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-kubelet\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677023 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b033c545-93a2-4401-842b-22456e44216b-cni-binary-copy\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677046 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-var-lib-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677088 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-log-socket\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677337 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-cni-bin\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677385 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-config\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677404 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-netns\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677425 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-cni-multus\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677445 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdq96\" (UniqueName: \"kubernetes.io/projected/b033c545-93a2-4401-842b-22456e44216b-kube-api-access-zdq96\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677511 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-netns\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677534 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677564 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-os-release\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677637 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-hostroot\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677683 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-multus-certs\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677715 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-node-log\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677741 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1972064c-ea30-421c-b009-2bc675a98fcc-ovn-node-metrics-cert\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677758 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcpjn\" (UniqueName: \"kubernetes.io/projected/1972064c-ea30-421c-b009-2bc675a98fcc-kube-api-access-pcpjn\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677788 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-cnibin\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677807 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677846 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-socket-dir-parent\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677890 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-slash\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677936 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-netd\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.677972 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678066 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-systemd-units\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678088 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678105 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-ovn\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678127 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-os-release\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678144 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-etc-kubernetes\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678160 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-kubelet\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678176 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-system-cni-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678192 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-k8s-cni-cncf-io\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678208 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-etc-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678228 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-systemd\") pod 
\"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678250 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cnibin\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.678274 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgds8\" (UniqueName: \"kubernetes.io/projected/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-kube-api-access-zgds8\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.679295 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.683005 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.683079 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.683092 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.683119 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.683131 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.692157 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.710192 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.724903 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.737410 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.749014 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.761687 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.772630 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778612 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b033c545-93a2-4401-842b-22456e44216b-cni-binary-copy\") pod \"multus-z6kjz\" (UID: 
\"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778651 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-var-lib-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778680 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-log-socket\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778701 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-config\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778724 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-cni-bin\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778741 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-netns\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778755 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-cni-multus\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778772 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdq96\" (UniqueName: \"kubernetes.io/projected/b033c545-93a2-4401-842b-22456e44216b-kube-api-access-zdq96\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778769 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-var-lib-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-netns\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 
15:52:43.778815 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-netns\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778842 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-multus-certs\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778872 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-cni-bin\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778872 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-cni-multus\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778894 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-log-socket\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778920 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-multus-certs\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778946 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.778954 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-netns\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779068 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-os-release\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779098 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-hostroot\") pod \"multus-z6kjz\" 
(UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779120 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-node-log\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779124 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779144 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1972064c-ea30-421c-b009-2bc675a98fcc-ovn-node-metrics-cert\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779170 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcpjn\" (UniqueName: \"kubernetes.io/projected/1972064c-ea30-421c-b009-2bc675a98fcc-kube-api-access-pcpjn\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779190 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-os-release\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779206 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-node-log\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779221 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-hostroot\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779426 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-cnibin\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779467 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779478 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-cnibin\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779539 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779589 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779589 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-socket-dir-parent\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779634 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-slash\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779650 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-netd\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779669 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-systemd-units\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779713 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779731 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-ovn\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779770 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-os-release\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-etc-kubernetes\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779809 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-kubelet\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779837 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-socket-dir-parent\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779852 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-system-cni-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779876 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-k8s-cni-cncf-io\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779896 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-etc-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779899 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-ovn\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779915 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-systemd\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779940 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-slash\") pod \"ovnkube-node-8mklc\" (UID: 
\"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779937 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cnibin\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779977 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgds8\" (UniqueName: \"kubernetes.io/projected/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-kube-api-access-zgds8\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779983 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-netd\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780001 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-config\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780021 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-systemd-units\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780027 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-ovn-kubernetes\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780004 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-ovn-kubernetes\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780062 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-env-overrides\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780067 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-etc-kubernetes\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " 
pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780082 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-cni-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780102 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b033c545-93a2-4401-842b-22456e44216b-multus-daemon-config\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780120 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780137 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-bin\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780142 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-os-release\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780156 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-system-cni-dir\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780186 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-conf-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780188 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-kubelet\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780192 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-system-cni-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780203 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-kubelet\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.779796 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b033c545-93a2-4401-842b-22456e44216b-cni-binary-copy\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780229 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-script-lib\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780296 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780321 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-systemd\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780347 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-bin\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780362 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-run-k8s-cni-cncf-io\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780399 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-etc-openvswitch\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780539 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-cni-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780565 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780653 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cnibin\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780696 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-multus-conf-dir\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780733 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-system-cni-dir\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.780942 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-script-lib\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.781006 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-env-overrides\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.781029 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b033c545-93a2-4401-842b-22456e44216b-host-var-lib-kubelet\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.781102 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b033c545-93a2-4401-842b-22456e44216b-multus-daemon-config\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.781470 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.787163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.787198 4933 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.787208 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.787226 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.787239 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.787344 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1972064c-ea30-421c-b009-2bc675a98fcc-ovn-node-metrics-cert\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.795898 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.797758 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcpjn\" (UniqueName: \"kubernetes.io/projected/1972064c-ea30-421c-b009-2bc675a98fcc-kube-api-access-pcpjn\") pod \"ovnkube-node-8mklc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.798582 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdq96\" (UniqueName: \"kubernetes.io/projected/b033c545-93a2-4401-842b-22456e44216b-kube-api-access-zdq96\") pod \"multus-z6kjz\" (UID: \"b033c545-93a2-4401-842b-22456e44216b\") " pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.803146 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgds8\" (UniqueName: \"kubernetes.io/projected/72a30a71-fe04-43a2-8f60-c9b12a0a6e5a-kube-api-access-zgds8\") pod \"multus-additional-cni-plugins-s779q\" (UID: \"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\") " pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.812223 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.825334 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.835676 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.853399 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.865525 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.879222 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.881727 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.881872 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:52:51.881844522 +0000 UTC m=+35.133071225 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.881976 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.882014 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.882143 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.882151 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.882211 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:51.882197141 +0000 UTC m=+35.133423844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.882241 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:51.882230462 +0000 UTC m=+35.133457165 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.891296 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.891362 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.891377 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.891400 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.891415 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.895373 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.911240 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.924310 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.937172 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.950274 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.956156 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.962213 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s779q" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.963739 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.970907 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z6kjz" Dec 02 15:52:43 crc kubenswrapper[4933]: W1202 15:52:43.971867 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1972064c_ea30_421c_b009_2bc675a98fcc.slice/crio-25abdf54a55a12332b638f065187ad0f7c9d3127374dcb58155f40674bc59d43 WatchSource:0}: Error finding container 25abdf54a55a12332b638f065187ad0f7c9d3127374dcb58155f40674bc59d43: Status 404 returned error can't find the container with id 25abdf54a55a12332b638f065187ad0f7c9d3127374dcb58155f40674bc59d43 Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.983557 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.983609 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.983765 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.983783 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.983795 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.983875 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:51.983853855 +0000 UTC m=+35.235080558 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.984439 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.984463 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.984475 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:43 crc kubenswrapper[4933]: E1202 15:52:43.984504 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:51.984494702 +0000 UTC m=+35.235721405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.987205 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.993795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.993858 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.993869 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.993886 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:43 crc kubenswrapper[4933]: I1202 15:52:43.993897 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:43Z","lastTransitionTime":"2025-12-02T15:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.053070 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:44 crc kubenswrapper[4933]: E1202 15:52:44.053208 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.053587 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:44 crc kubenswrapper[4933]: E1202 15:52:44.053651 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.053698 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:44 crc kubenswrapper[4933]: E1202 15:52:44.053745 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.112200 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.112253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.112265 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.112286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.112300 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.217274 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.217336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.217353 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.217376 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.217389 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.231764 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" exitCode=0 Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.231871 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.231940 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"25abdf54a55a12332b638f065187ad0f7c9d3127374dcb58155f40674bc59d43"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.234258 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerStarted","Data":"a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.234327 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerStarted","Data":"2d93c50d83cefb292cff20fff6a8505f58b14df8763e318c8cc42bc254ec1103"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.236630 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5d9dn" event={"ID":"97edaf10-912b-42e7-a9e7-930381d48508","Type":"ContainerStarted","Data":"ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.236685 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5d9dn" event={"ID":"97edaf10-912b-42e7-a9e7-930381d48508","Type":"ContainerStarted","Data":"c303bd18519e3e15b69d5b56b42e5ec98c6ea22f0e4a1602bcd6311560a00e36"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.242352 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.242406 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.242419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"067802d0161cf19da95aa4d7395a7887d9f2b777410492ff2010376e28e0e474"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.252603 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerStarted","Data":"7a30681fada08b471c4881c901cad65378d0285d2c805f3e841af4353d4e0f98"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 
15:52:44.253373 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.266849 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.284986 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.301125 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.320795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.320852 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.320862 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.320878 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.320890 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.324707 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.343097 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.362892 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.377133 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.391569 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb9
1d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.410361 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.423881 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.423922 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.423933 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.423949 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.423977 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.424468 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.436405 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.448850 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.464181 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.482264 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.494539 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.508155 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.526138 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.526172 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.526180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.526193 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.526202 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.531429 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b72
06582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.544611 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.554806 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.566104 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.576717 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.590586 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.603769 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.615286 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.628604 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.628666 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.628677 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.628693 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.628704 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.630246 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.731760 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.731808 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.731832 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.731847 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.731858 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.833712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.833784 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.833795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.833811 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.833842 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.936803 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.937138 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.937152 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.937170 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:44 crc kubenswrapper[4933]: I1202 15:52:44.937183 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:44Z","lastTransitionTime":"2025-12-02T15:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.041409 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.041451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.041463 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.041486 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.041496 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.144329 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.144358 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.144366 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.144378 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.144387 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.281170 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.281471 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.281479 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.281493 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.281501 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.285289 4933 generic.go:334] "Generic (PLEG): container finished" podID="72a30a71-fe04-43a2-8f60-c9b12a0a6e5a" containerID="a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf" exitCode=0 Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.285389 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerDied","Data":"a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.289288 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.289349 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.289376 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.289400 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.304339 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.321260 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.333730 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.350400 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.363539 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.380276 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.384330 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.384361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.384371 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.384385 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.384396 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.403604 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.420351 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.438308 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.454482 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.477310 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.492213 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.492542 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.492615 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.492680 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.492749 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.499438 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.525537 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.595574 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.595865 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.595943 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.596002 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.596054 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.699292 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.699343 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.699356 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.699375 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.699387 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.802775 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.803142 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.803243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.803327 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.803395 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.906934 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.906977 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.906989 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.907007 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:45 crc kubenswrapper[4933]: I1202 15:52:45.907019 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:45Z","lastTransitionTime":"2025-12-02T15:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.010311 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.010351 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.010362 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.010378 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.010388 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.053017 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:46 crc kubenswrapper[4933]: E1202 15:52:46.053145 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.053442 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:46 crc kubenswrapper[4933]: E1202 15:52:46.053486 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.054623 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:46 crc kubenswrapper[4933]: E1202 15:52:46.054916 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.112672 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.112731 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.112744 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.112764 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.112779 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.215245 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.215320 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.215342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.215368 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.215387 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.295883 4933 generic.go:334] "Generic (PLEG): container finished" podID="72a30a71-fe04-43a2-8f60-c9b12a0a6e5a" containerID="4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5" exitCode=0 Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.295969 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerDied","Data":"4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.304174 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.304245 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.318916 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.318967 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.318980 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.318997 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.319009 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.323584 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.359632 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.387502 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.422119 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.422156 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.422165 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.422185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.422196 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.428258 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.444428 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.461725 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.476130 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.495163 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.513292 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.526107 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.526168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.526182 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.526205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.526220 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.532807 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b72
06582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.544796 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.559472 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.570757 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:46Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.631236 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.631278 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.631288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.631305 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.631315 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.734604 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.734675 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.734689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.734711 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.734726 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.837659 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.837749 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.837783 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.837857 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.837891 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.921384 4933 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.941091 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.941139 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.941150 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.941168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:46 crc kubenswrapper[4933]: I1202 15:52:46.941180 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:46Z","lastTransitionTime":"2025-12-02T15:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.044294 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.044714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.044890 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.045042 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.045156 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.066245 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.079967 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.091954 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.106353 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.122008 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.135434 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.148300 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.148390 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.148406 4933 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.148434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.148451 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.148955 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.166549 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.186900 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.213569 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.235735 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.251436 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.251501 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.251542 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.251798 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.251812 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.252105 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.266508 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.310187 4933 generic.go:334] "Generic (PLEG): container finished" podID="72a30a71-fe04-43a2-8f60-c9b12a0a6e5a" containerID="89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1" exitCode=0 Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.310236 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerDied","Data":"89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1"} 
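Every status patch above fails the same way: the kubelet cannot reach the pod.network-node-identity.openshift.io webhook because its serving certificate expired on 2025-08-24T17:21:41Z, well before the current clock of 2025-12-02T15:52:47Z. The following is a minimal diagnostic sketch, not part of the cluster tooling, that confirms the certificate window directly; it assumes the webhook is still listening on 127.0.0.1:9743 (the address shown in the log), and it sets InsecureSkipVerify only so the handshake with the expired certificate completes for inspection.

// certcheck.go: print the validity window of the webhook serving certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip chain verification: we want to read the expired leaf cert,
	// not trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook: %v", err)
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		log.Fatal("server presented no certificate")
	}
	cert := certs[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate is expired, matching the kubelet x509 error")
	}
}

On a healthy cluster notAfter would lie in the future; against this node it should print 2025-08-24T17:21:41Z, the same date quoted in every "failed calling webhook" error above.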
Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.321650 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.333225 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.340233 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fl25w"] Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.340635 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.342453 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.342953 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.344723 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.344986 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.347447 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.353958 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.353979 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.353987 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.354000 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.354008 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.362506 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.379224 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.391349 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.406283 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.417627 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-host\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.417660 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559sn\" (UniqueName: \"kubernetes.io/projected/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-kube-api-access-559sn\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.417686 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-serviceca\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.422589 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.439791 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.456400 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.457642 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.457699 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.457714 4933 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.457735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.457749 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.472073 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.493430 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.510731 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.518595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-host\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.518656 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559sn\" (UniqueName: \"kubernetes.io/projected/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-kube-api-access-559sn\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.518694 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-serviceca\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.518815 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-host\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.519858 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-serviceca\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.530438 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb9
1d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.539460 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559sn\" (UniqueName: \"kubernetes.io/projected/680c0df1-e4d6-4e1c-a36d-2378e821d2d9-kube-api-access-559sn\") pod \"node-ca-fl25w\" (UID: \"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\") " pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.550325 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.560947 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.561014 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.561025 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.561070 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.561081 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.565698 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.579060 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.596088 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.610395 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.625326 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.638719 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.651516 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.664005 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.664061 4933 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.664072 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.664108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.664119 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.664844 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fl25w" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.668384 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\
\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: W1202 15:52:47.689970 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680c0df1_e4d6_4e1c_a36d_2378e821d2d9.slice/crio-b4be6790fdc3873a7088bc1411d98c670e66d471b5aa3b7c08883ef1810ce658 WatchSource:0}: Error finding container b4be6790fdc3873a7088bc1411d98c670e66d471b5aa3b7c08883ef1810ce658: Status 404 returned error can't find the container with id b4be6790fdc3873a7088bc1411d98c670e66d471b5aa3b7c08883ef1810ce658 Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.696511 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.714529 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.735409 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.752367 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.766552 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.766601 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.766614 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.766632 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.766643 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.870485 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.870534 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.870551 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.870575 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.870589 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.973112 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.973160 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.973170 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.973185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:47 crc kubenswrapper[4933]: I1202 15:52:47.973195 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:47Z","lastTransitionTime":"2025-12-02T15:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.052654 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:48 crc kubenswrapper[4933]: E1202 15:52:48.052874 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.053411 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:48 crc kubenswrapper[4933]: E1202 15:52:48.053467 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.053518 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:48 crc kubenswrapper[4933]: E1202 15:52:48.053570 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.076493 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.076539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.076554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.076574 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.076587 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.180162 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.180501 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.180511 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.180527 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.180540 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.283547 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.283600 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.283612 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.283630 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.283643 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.319231 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.322435 4933 generic.go:334] "Generic (PLEG): container finished" podID="72a30a71-fe04-43a2-8f60-c9b12a0a6e5a" containerID="4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4" exitCode=0 Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.322491 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerDied","Data":"4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.325497 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fl25w" event={"ID":"680c0df1-e4d6-4e1c-a36d-2378e821d2d9","Type":"ContainerStarted","Data":"bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.325571 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fl25w" event={"ID":"680c0df1-e4d6-4e1c-a36d-2378e821d2d9","Type":"ContainerStarted","Data":"b4be6790fdc3873a7088bc1411d98c670e66d471b5aa3b7c08883ef1810ce658"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.351457 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.372981 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.386401 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.386447 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.386459 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.386479 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.386492 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.391720 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.406972 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.420473 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.438875 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.457670 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.474677 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.489497 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.489504 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.489547 4933 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.489561 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.489582 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.489595 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.507068 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"na
me\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.524110 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.537684 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.550321 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.566311 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.585881 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25c
e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.601435 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.601669 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.602360 4933 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.602392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.602407 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.606935 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.621897 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.633584 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.647417 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.661043 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.686390 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.701583 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.706005 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.706048 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.706057 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.706073 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.706085 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.717482 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.734403 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.751078 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.764978 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.786980 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.802119 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:48Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.809707 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.809766 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.809777 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.809796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.809806 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.912653 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.912684 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.912692 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.912704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:48 crc kubenswrapper[4933]: I1202 15:52:48.912713 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:48Z","lastTransitionTime":"2025-12-02T15:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.015189 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.015262 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.015274 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.015315 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.015326 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.033287 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.050517 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.073217 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z 
is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.085907 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.102499 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.118007 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.119087 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.119127 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.119136 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.119152 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.119165 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.130531 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.146900 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.160259 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.170174 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.180604 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.197988 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.211742 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.222172 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.222199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.222208 4933 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.222224 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.222233 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.225273 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.237512 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.325561 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.325593 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.325601 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.325615 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.325623 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.332091 4933 generic.go:334] "Generic (PLEG): container finished" podID="72a30a71-fe04-43a2-8f60-c9b12a0a6e5a" containerID="4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b" exitCode=0 Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.332149 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerDied","Data":"4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.348050 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.373515 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.391361 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.403811 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.414328 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.428286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.428345 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.428361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.428382 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.428398 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.429853 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.446806 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.461289 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.473570 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.496453 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.516738 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.530442 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.532051 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.532087 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.532102 4933 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.532123 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.532138 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.544632 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.567909 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:49Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.635372 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.635428 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.635439 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.635457 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.635471 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.738882 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.738952 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.738970 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.738996 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.739013 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.842652 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.843394 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.843429 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.843461 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.843476 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.946650 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.946715 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.946727 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.946749 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:49 crc kubenswrapper[4933]: I1202 15:52:49.946763 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:49Z","lastTransitionTime":"2025-12-02T15:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.050500 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.050570 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.050586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.050609 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.050625 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.053325 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.053425 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.053486 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.053337 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.053633 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.053727 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.153914 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.153956 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.153968 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.153987 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.153999 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.235355 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.235441 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.235452 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.235472 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.235485 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.249759 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.253700 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.253733 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.253743 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.253763 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.253776 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.266057 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.270810 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.270881 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.270897 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.270928 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.270941 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.287093 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.291200 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.291228 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.291237 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.291253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.291262 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.311420 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.315004 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.315072 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.315088 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.315106 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.315117 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.326804 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: E1202 15:52:50.326976 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.328756 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.328796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.328804 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.328835 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.328871 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.339144 4933 generic.go:334] "Generic (PLEG): container finished" podID="72a30a71-fe04-43a2-8f60-c9b12a0a6e5a" containerID="e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692" exitCode=0 Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.339209 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerDied","Data":"e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.352034 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.353111 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.353186 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.358783 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.374325 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.381487 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.383387 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.390487 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.406852 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.425017 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.436114 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.436170 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.436185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.436205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.436217 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.452414 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b72
06582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.466623 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.483158 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.500956 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.516129 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.536655 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.539649 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.539725 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.539760 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.539785 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.539850 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.551691 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.565532 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.580551 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.602281 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.626530 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00
ce2443dcd0b6e1835d9cc835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.642287 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.644093 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.644166 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.644190 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.645205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.645265 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.655368 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.667002 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.678284 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.692448 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.707771 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.723026 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.736959 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.747592 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.747633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.747644 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.747659 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.747669 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.750529 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.764428 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.777961 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.790817 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:50Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.850888 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.850940 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.850951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.850968 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.850981 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.952893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.952936 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.952954 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.952974 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:50 crc kubenswrapper[4933]: I1202 15:52:50.952986 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:50Z","lastTransitionTime":"2025-12-02T15:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.065718 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.065764 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.065776 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.065796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.065810 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.168550 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.168596 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.168610 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.168627 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.168640 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.271377 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.271435 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.271451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.271473 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.271488 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.362501 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" event={"ID":"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a","Type":"ContainerStarted","Data":"a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.362590 4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.374644 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.374713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.374727 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.374745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.374758 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.387635 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.402683 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.418449 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.440627 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.463909 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.477237 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.477277 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.477290 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.477306 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.477320 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.484955 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\"
,\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.503678 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.527265 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.545627 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.568899 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.579843 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.579880 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.579893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.579911 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.579925 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.589300 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.607729 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.621757 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.640474 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00
ce2443dcd0b6e1835d9cc835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.682779 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.682835 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.682846 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.682861 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.682872 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.786070 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.786586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.786600 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.786621 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.786641 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.889467 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.889495 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.889503 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.889516 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:51 crc kubenswrapper[4933]: I1202 15:52:51.889525 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:51Z","lastTransitionTime":"2025-12-02T15:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.004316 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.004456 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.004492 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.004518 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.004539 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.004652 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.004709 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:08.004689583 +0000 UTC m=+51.255916306 (durationBeforeRetry 16s). 
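The "No retries permitted until 2025-12-02 15:53:08 ... (durationBeforeRetry 16s)" entry above is the volume manager's per-operation exponential backoff. Here is a sketch of the arithmetic, assuming the upstream kubelet defaults (initial 0.5 s, doubling per failure, capped a little over two minutes); the parameters are an assumption, and only the 16 s delay and the retry timestamp come from this log.

```python
#!/usr/bin/env python3
"""Sketch of the retry arithmetic behind 'durationBeforeRetry 16s'.
Backoff parameters are assumed upstream defaults, not read from this
log; the timestamps below are taken from the entry above."""
from datetime import datetime, timedelta

INITIAL, FACTOR, CAP = 0.5, 2, 122.0  # assumed defaults (seconds)

delay, failures = INITIAL, 1
while delay < 16:
    delay = min(delay * FACTOR, CAP)
    failures += 1
print(f"a 16s delay implies ~{failures} consecutive failures")  # 0.5 * 2**5 = 16

failed_at = datetime.fromisoformat("2025-12-02 15:52:52.004689")
print("retry at:", failed_at + timedelta(seconds=16))  # 15:53:08, as logged
```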
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005182 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005196 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005313 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:53:08.005248277 +0000 UTC m=+51.256475000 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005412 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:08.005392101 +0000 UTC m=+51.256618874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005215 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005471 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.005576 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:08.005540375 +0000 UTC m=+51.256767118 (durationBeforeRetry 16s). 
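The UnmountVolume.TearDown failure above reports that kubevirt.io.hostpath-provisioner is not in the kubelet's list of registered CSI drivers; node plugins normally announce themselves through a registration socket under /var/lib/kubelet/plugins_registry/. A sketch of that check follows; the socket-directory convention is an assumption, while the driver name comes from the error text.

```python
#!/usr/bin/env python3
"""Sketch: which CSI drivers have registered with this kubelet?
The plugins_registry convention is an assumption; the driver name to
look for is taken from the TearDown error above."""
import os
import stat

REG_DIR = "/var/lib/kubelet/plugins_registry"
WANTED = "kubevirt.io.hostpath-provisioner"

found = []
for name in sorted(os.listdir(REG_DIR)):
    full = os.path.join(REG_DIR, name)
    if stat.S_ISSOCK(os.stat(full).st_mode):  # registration sockets only
        found.append(name)
        print("registered:", name)
if not any(WANTED in name for name in found):
    print(f"{WANTED} not registered, TearDown will keep failing")
```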
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.006214 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.006566 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.006620 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.007272 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:08.007227817 +0000 UTC m=+51.258454570 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.009210 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.009288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.009316 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.009349 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.009375 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.052692 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.052780 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.052886 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.052995 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.053111 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:52 crc kubenswrapper[4933]: E1202 15:52:52.053263 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.112897 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.113020 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.113046 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.113072 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.113090 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.215078 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.215121 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.215131 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.215148 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.215161 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.318279 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.318355 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.318379 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.318409 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.318430 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.369363 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/0.log" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.373793 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835" exitCode=1 Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.373898 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.375290 4933 scope.go:117] "RemoveContainer" containerID="266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.404609 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.424513 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.424580 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.424602 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.424633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.424653 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.435952 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.455648 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.472616 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.498866 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 
2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.524119 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.528037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.528356 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.528379 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.528399 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.528413 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.539420 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.554300 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.576608 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:52Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182648 6243 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182987 6243 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.183056 6243 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183310 6243 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183953 6243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 15:52:52.183976 6243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 15:52:52.183990 6243 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 15:52:52.184013 6243 factory.go:656] Stopping watch factory\\\\nI1202 15:52:52.184023 6243 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 15:52:52.184029 6243 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:52:52.184039 6243 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.590683 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.603672 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.617874 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.632224 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.632275 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.632287 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.632308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.632320 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.633684 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.648162 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.735127 4933 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.735168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.735182 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.735198 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.735212 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.838668 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.838714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.838728 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.838749 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.838762 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.940877 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.940921 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.940933 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.940951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:52 crc kubenswrapper[4933]: I1202 15:52:52.940963 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:52Z","lastTransitionTime":"2025-12-02T15:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.043631 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.043674 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.043685 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.043700 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.043709 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.145660 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.145708 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.145738 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.145755 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.145769 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.248627 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.248679 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.248689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.248705 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.248715 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.351343 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.351433 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.351470 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.351505 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.351528 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.380109 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/0.log" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.383328 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.383542 4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.399707 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.412079 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.428363 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.444247 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.453984 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.454021 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.454032 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.454050 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.454062 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.457040 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.470049 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.487078 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.505679 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.524548 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.556300 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca
6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:52Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182648 6243 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182987 6243 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.183056 6243 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183310 6243 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183953 6243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 15:52:52.183976 6243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 15:52:52.183990 6243 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 15:52:52.184013 6243 factory.go:656] Stopping watch factory\\\\nI1202 15:52:52.184023 6243 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 15:52:52.184029 6243 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:52:52.184039 6243 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.556951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.557011 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.557030 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.557055 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.557074 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.570063 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.582440 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.594385 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.604431 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.661568 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.661613 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.661626 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.661644 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.661658 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.763591 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.763670 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.763692 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.763722 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.763743 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.867051 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.867108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.867117 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.867137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.867149 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.969898 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.969959 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.969972 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.969999 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:53 crc kubenswrapper[4933]: I1202 15:52:53.970014 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:53Z","lastTransitionTime":"2025-12-02T15:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.053248 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.053294 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.053248 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:54 crc kubenswrapper[4933]: E1202 15:52:54.053405 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:54 crc kubenswrapper[4933]: E1202 15:52:54.053455 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:54 crc kubenswrapper[4933]: E1202 15:52:54.053523 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.072309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.072404 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.072423 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.072478 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.072492 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.175902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.175934 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.175946 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.175960 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.175970 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.278330 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.278367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.278377 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.278393 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.278407 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.381837 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.381901 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.381914 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.381942 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.381957 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.387737 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/1.log" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.388472 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/0.log" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.392003 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df" exitCode=1 Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.392061 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.392128 4933 scope.go:117] "RemoveContainer" containerID="266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.392919 4933 scope.go:117] "RemoveContainer" containerID="04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df" Dec 02 15:52:54 crc kubenswrapper[4933]: E1202 15:52:54.393118 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.418298 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.436797 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.451729 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.464336 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.481559 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.484095 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.484148 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.484161 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.484181 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.484193 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.498927 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.515457 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.532063 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.548701 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.567084 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.584150 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.586011 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.586045 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.586057 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.586075 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.586086 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.601814 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.616597 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb9
1d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.668365 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca
6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:52Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182648 6243 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182987 6243 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.183056 6243 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183310 6243 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183953 6243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 15:52:52.183976 6243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 15:52:52.183990 6243 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 15:52:52.184013 6243 factory.go:656] Stopping watch factory\\\\nI1202 15:52:52.184023 6243 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 15:52:52.184029 6243 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:52:52.184039 6243 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.688006 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.688241 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.688424 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.688496 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.688552 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.783431 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w"] Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.784356 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.790769 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.791800 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.791925 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.792003 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.792082 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.792275 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.795380 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.803819 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.822229 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.835729 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.851797 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.867069 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.880187 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.891891 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.895660 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.895798 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.895934 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.896058 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.896145 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.909807 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.923657 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb9
1d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.942181 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca
6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266447045a0c1f371e1f4464fa07d948355d9b00ce2443dcd0b6e1835d9cc835\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:52Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182648 6243 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.182987 6243 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 15:52:52.183056 6243 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183310 6243 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 15:52:52.183953 6243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 15:52:52.183976 6243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 15:52:52.183990 6243 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 15:52:52.184013 6243 factory.go:656] Stopping watch factory\\\\nI1202 15:52:52.184023 6243 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 15:52:52.184029 6243 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:52:52.184039 6243 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.944527 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a161d9d5-a56f-45e9-93e4-50e7220cd31e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.944774 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a161d9d5-a56f-45e9-93e4-50e7220cd31e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.944902 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a161d9d5-a56f-45e9-93e4-50e7220cd31e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.945033 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnvpp\" (UniqueName: \"kubernetes.io/projected/a161d9d5-a56f-45e9-93e4-50e7220cd31e-kube-api-access-mnvpp\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.956143 4933 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.970768 4933 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.984448 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.995570 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:54Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.999015 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.999119 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.999255 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.999321 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:54 crc kubenswrapper[4933]: I1202 15:52:54.999377 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:54Z","lastTransitionTime":"2025-12-02T15:52:54Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.006101 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.045953 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnvpp\" (UniqueName: \"kubernetes.io/projected/a161d9d5-a56f-45e9-93e4-50e7220cd31e-kube-api-access-mnvpp\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.045993 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a161d9d5-a56f-45e9-93e4-50e7220cd31e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.046010 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a161d9d5-a56f-45e9-93e4-50e7220cd31e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.046030 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a161d9d5-a56f-45e9-93e4-50e7220cd31e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.046613 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a161d9d5-a56f-45e9-93e4-50e7220cd31e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.047213 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a161d9d5-a56f-45e9-93e4-50e7220cd31e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.056837 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a161d9d5-a56f-45e9-93e4-50e7220cd31e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.068314 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnvpp\" (UniqueName: \"kubernetes.io/projected/a161d9d5-a56f-45e9-93e4-50e7220cd31e-kube-api-access-mnvpp\") pod \"ovnkube-control-plane-749d76644c-s488w\" (UID: \"a161d9d5-a56f-45e9-93e4-50e7220cd31e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.097775 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.101553 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.101598 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.101610 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.101624 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.101634 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: W1202 15:52:55.110142 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda161d9d5_a56f_45e9_93e4_50e7220cd31e.slice/crio-a1dd1b373aa58494e62a97b9c4cd930afcef7ca981b6de70dc1f5cdf37b24418 WatchSource:0}: Error finding container a1dd1b373aa58494e62a97b9c4cd930afcef7ca981b6de70dc1f5cdf37b24418: Status 404 returned error can't find the container with id a1dd1b373aa58494e62a97b9c4cd930afcef7ca981b6de70dc1f5cdf37b24418 Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.205382 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.205422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.205431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.205446 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.205455 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.308406 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.308476 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.308494 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.308523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.308544 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.397038 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" event={"ID":"a161d9d5-a56f-45e9-93e4-50e7220cd31e","Type":"ContainerStarted","Data":"a1dd1b373aa58494e62a97b9c4cd930afcef7ca981b6de70dc1f5cdf37b24418"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.399082 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/1.log" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.402620 4933 scope.go:117] "RemoveContainer" containerID="04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df" Dec 02 15:52:55 crc kubenswrapper[4933]: E1202 15:52:55.402933 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.411612 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.411664 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.411688 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.411717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.411739 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.417033 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.435698 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.449610 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.462810 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.486103 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.505170 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.514488 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.514521 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.514531 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.514545 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.514554 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.519462 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.545206 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.566699 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.583339 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.599306 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.618064 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.618117 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.618135 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.618153 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.618167 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.624276 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.639483 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.653105 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66f
e7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.678006 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca
6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:55Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.721551 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.721601 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.721614 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.721635 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.721649 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.825493 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.825528 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.825537 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.825550 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.825558 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.928451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.928498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.928510 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.928530 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:55 crc kubenswrapper[4933]: I1202 15:52:55.928542 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:55Z","lastTransitionTime":"2025-12-02T15:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.031304 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.031347 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.031357 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.031374 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.031384 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.052690 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.052747 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.052769 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:52:56 crc kubenswrapper[4933]: E1202 15:52:56.052848 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:52:56 crc kubenswrapper[4933]: E1202 15:52:56.053000 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:52:56 crc kubenswrapper[4933]: E1202 15:52:56.053076 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.134327 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.134386 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.134402 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.134422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.134437 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.237723 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.237776 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.237787 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.237810 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.237852 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.340482 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.340535 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.340546 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.340564 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.340587 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.408050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" event={"ID":"a161d9d5-a56f-45e9-93e4-50e7220cd31e","Type":"ContainerStarted","Data":"5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.408112 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" event={"ID":"a161d9d5-a56f-45e9-93e4-50e7220cd31e","Type":"ContainerStarted","Data":"38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.432443 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.442632 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.442676 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.442688 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.442707 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.442719 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.451560 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.482426 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.511775 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.538475 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.545477 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.545515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.545524 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.545539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.545551 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.563137 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.577007 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.590248 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.604673 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.617458 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.630150 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.643336 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.651600 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.651647 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.651656 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.651670 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.651680 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.662484 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.674725 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.688937 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:56Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.753975 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.754061 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.754080 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.754104 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.754121 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.858470 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.858531 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.858540 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.858556 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.858571 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.962156 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.962221 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.962238 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.962260 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.962278 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:56Z","lastTransitionTime":"2025-12-02T15:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.993000 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qbps2"] Dec 02 15:52:56 crc kubenswrapper[4933]: I1202 15:52:56.993696 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:56 crc kubenswrapper[4933]: E1202 15:52:56.993796 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.012712 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.043423 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca
6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.062479 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.064731 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.064780 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.064799 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.064844 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.064861 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.078854 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.094138 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.108653 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.123650 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.142470 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.158064 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.167435 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.167543 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.167572 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.167605 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.167628 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.170990 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.171096 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdwf7\" (UniqueName: \"kubernetes.io/projected/c95a4730-1427-4097-9ca3-4bd251e7acf0-kube-api-access-cdwf7\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.175156 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.198320 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.213277 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.229039 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.247322 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.264087 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701
b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.270199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.270231 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.270243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.270261 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.270274 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.271639 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.271718 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdwf7\" (UniqueName: \"kubernetes.io/projected/c95a4730-1427-4097-9ca3-4bd251e7acf0-kube-api-access-cdwf7\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:57 crc kubenswrapper[4933]: E1202 15:52:57.271847 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:52:57 crc kubenswrapper[4933]: E1202 15:52:57.271917 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:57.771898207 +0000 UTC m=+41.023124920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.282206 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.301419 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.302338 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdwf7\" (UniqueName: \"kubernetes.io/projected/c95a4730-1427-4097-9ca3-4bd251e7acf0-kube-api-access-cdwf7\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.329529 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca
6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.348771 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.364892 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.373745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.373783 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.373795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.373812 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.373841 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.379663 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.390884 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.402486 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.414304 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.429389 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.443425 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.456243 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.467548 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.476988 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.477058 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.477069 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.477086 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.477096 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.483469 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.496870 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.516647 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.535705 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.579172 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.579218 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc 
kubenswrapper[4933]: I1202 15:52:57.579230 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.579248 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.579262 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.682338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.682392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.682448 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.682471 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.682483 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.776974 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:52:57 crc kubenswrapper[4933]: E1202 15:52:57.777228 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:52:57 crc kubenswrapper[4933]: E1202 15:52:57.777373 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:52:58.777344845 +0000 UTC m=+42.028571548 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.785405 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.785465 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.785478 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.785497 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.785507 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.888243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.888292 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.888301 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.888318 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:52:57 crc kubenswrapper[4933]: I1202 15:52:57.888333 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:52:57Z","lastTransitionTime":"2025-12-02T15:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:52:58 crc kubenswrapper[4933]: I1202 15:52:58.052951 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:52:58 crc kubenswrapper[4933]: I1202 15:52:58.053098 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:52:58 crc kubenswrapper[4933]: E1202 15:52:58.053179 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:52:58 crc kubenswrapper[4933]: E1202 15:52:58.053340 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:52:58 crc kubenswrapper[4933]: I1202 15:52:58.052957 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:52:58 crc kubenswrapper[4933]: E1202 15:52:58.053526 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:52:58 crc kubenswrapper[4933]: I1202 15:52:58.788476 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:52:58 crc kubenswrapper[4933]: E1202 15:52:58.788777 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 15:52:58 crc kubenswrapper[4933]: E1202 15:52:58.788991 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:00.788964411 +0000 UTC m=+44.040191154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered
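[editor's note: "object ... not registered" typically means the kubelet's secret manager has not yet registered the referencing pod with its object cache (common while the node is still coming up), rather than that the secret is missing from the API. Querying the API directly separates the two cases. A minimal sketch, assuming the kubernetes Python client and a kubeconfig able to read openshift-multus:]

```python
# Sketch: distinguish "secret missing from the API" from the kubelet's
# cache complaint "object ... not registered".
from kubernetes import client, config
from kubernetes.client.rest import ApiException

config.load_kube_config()
v1 = client.CoreV1Api()
try:
    s = v1.read_namespaced_secret("metrics-daemon-secret", "openshift-multus")
    # If it exists, the mount error above points at the kubelet's secret
    # manager not having registered the pod yet, not at a missing object.
    print("secret exists in the API:", s.metadata.name)
except ApiException as e:
    if e.status == 404:
        print("secret genuinely absent - the mount can never succeed")
    else:
        raise
```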
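[editor's note: the "durationBeforeRetry 2s" above is the volume manager's exponential backoff between mount retries. The sketch below shows the pattern only; the initial delay, factor, and cap are illustrative assumptions, not the kubelet's exact constants:]

```python
# Sketch of the exponential backoff pattern behind "durationBeforeRetry 2s".
# Constants are assumptions for illustration, not kubelet's actual values.
def backoff_schedule(initial=0.5, factor=2.0, cap=122.0):
    delay = initial
    while True:
        yield min(delay, cap)
        delay *= factor

for attempt, delay in zip(range(1, 9), backoff_schedule()):
    # The log's 2s entry would correspond to one of the early attempts.
    print(f"attempt {attempt}: wait {delay:.1f}s before retrying the mount")
```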
Dec 02 15:52:59 crc kubenswrapper[4933]: I1202 15:52:59.053323 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:52:59 crc kubenswrapper[4933]: E1202 15:52:59.053486 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
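[editor's note: with sandbox creation failing, every pod scheduled to this node stays Pending, like network-metrics-daemon-qbps2 above. A sketch listing them through the API, assuming the kubernetes Python client and a kubeconfig:]

```python
# Sketch: list Pending pods cluster-wide; on this node that is every
# pod waiting on a sandbox while CNI is absent.
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()
pending = v1.list_pod_for_all_namespaces(field_selector="status.phase=Pending")
for pod in pending.items:
    print(f"{pod.metadata.namespace}/{pod.metadata.name} pending on {pod.spec.node_name}")
```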
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.053411 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.053485 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.053558 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.053411 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
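[editor's note: captures like this one are dominated by the repeating status block elided above. A small sketch for quantifying the repetition offline, by stripping timestamps and counting identical records; it reads a journal capture on stdin, e.g. piped from journalctl -u kubelet:]

```python
# Sketch: count how often each record repeats once timestamps are stripped,
# to quantify the heartbeat spam elided in this excerpt.
import re
import sys
from collections import Counter

TS = re.compile(r"(\d{2}:\d{2}:\d{2}\.\d+|\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z)")

counts = Counter(TS.sub("<ts>", line.rstrip("\n")) for line in sys.stdin)
for record, n in counts.most_common(10):
    print(f"{n:4d}x {record[:120]}")
```

[In this excerpt the repeating unit is a five-record block, so grouping on a sliding window of five normalized records would collapse it further; the counter above is the simpler first pass.]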
Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.053724 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.053887 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.427663 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.427723 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.427735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.427758 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.427773 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.442896 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:00Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.447580 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.447662 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.447682 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.447709 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.447728 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.467753 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:00Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.473086 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.473151 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
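[editor's note: both patch attempts above fail for the same reason: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired 2025-08-24T17:21:41Z, long before the node's clock time of 2025-12-02. The validity window can be confirmed directly from the node; a sketch assuming the third-party cryptography package (>= 42 for the *_utc accessors), with the endpoint taken from the log:]

```python
# Sketch: fetch the webhook's serving certificate and print its validity
# window, to compare with the x509 "certificate has expired" error above.
import ssl
from cryptography import x509

# With the default ca_certs=None, verification is disabled, so an
# expired certificate is still returned for inspection.
pem = ssl.get_server_certificate(("127.0.0.1", 9743))
cert = x509.load_pem_x509_certificate(pem.encode())
print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before_utc)
print("not after: ", cert.not_valid_after_utc)  # 2025-08-24T17:21:41Z per the log
```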
event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.473174 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.473203 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.473226 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.494562 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:00Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.499721 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.499791 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.499804 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.499854 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.499883 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.518918 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:00Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.525417 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.525485 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.525517 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.525539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.525558 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.541702 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:00Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.541854 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.544214 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
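
Every one of the status-patch failures above (the retry attempts carried a payload byte-identical to the first, so only the first is kept here) has the same root cause: the webhook serving endpoint at https://127.0.0.1:9743 presents a certificate whose validity ended 2025-08-24T17:21:41Z, while the node clock reads 2025-12-02T15:53:00Z, so the kubelet's TLS handshake fails before the patch is ever applied and the fixed retry budget is exhausted. A quick way to confirm what that endpoint is serving is to read the dates off the certificate directly; the sketch below does that in Go. Only the address comes from the log; the rest is an assumed, standalone probe, not part of any kubelet or OpenShift tooling.

// certprobe.go - a minimal sketch for inspecting the serving certificate that
// the kubelet's webhook call is rejecting. The endpoint comes from the log
// line above; everything else is an assumption for illustration.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip chain verification on purpose: the point is to read the dates
	// off a certificate that normal verification would reject as expired.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Printf("expired:   %v (now %s)\n", now.After(cert.NotAfter), now.Format(time.RFC3339))
}

With verification skipped the handshake still completes, so the probe can report the validity window even for the certificate the kubelet cannot accept.
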
event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.544281 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.544302 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.544334 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.544364 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.647655 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.647734 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.647758 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.647789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.647813 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.751231 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.751296 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.751309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.751332 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.751346 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.810057 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.810264 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 15:53:00 crc kubenswrapper[4933]: E1202 15:53:00.810371 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:04.810328475 +0000 UTC m=+48.061555178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered
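
Independently of the node-status retries, the volume manager tries to mount the metrics-certs secret volume for network-metrics-daemon-qbps2, but the kubelet's object cache has no openshift-multus/metrics-daemon-secret yet ("not registered"), so the mount fails and nestedpendingoperations schedules the next attempt 4s out. That delay is the familiar doubling-with-cap backoff pattern; the sketch below reproduces the cadence. The initial delay, factor, cap and attempt budget are illustrative assumptions, not constants read out of the kubelet.

// backoff.go - the doubling-with-cap retry pattern that produces delays like
// the "durationBeforeRetry 4s" above. All parameters here are assumptions
// for illustration.
package main

import (
	"errors"
	"fmt"
	"time"
)

func retryWithBackoff(op func() error, initial, max time.Duration, attempts int) error {
	delay := initial
	for i := 0; i < attempts; i++ {
		if err := op(); err == nil {
			return nil
		}
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", i+1, delay)
		time.Sleep(delay)
		if delay *= 2; delay > max {
			delay = max
		}
	}
	return errors.New("retry budget exhausted")
}

func main() {
	// An operation that never succeeds, to show the 500ms, 1s, 2s, 4s, ... cadence.
	_ = retryWithBackoff(func() error { return errors.New("secret not registered") },
		500*time.Millisecond, 2*time.Minute, 5)
}
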
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.853639 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.853713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.853724 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.853742 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.853755 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.957469 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.957539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.957551 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.957572 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:00 crc kubenswrapper[4933]: I1202 15:53:00.957585 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
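
Each "Node became not ready" record carries the Ready condition inline as JSON, in the same shape as a v1.NodeCondition: type, status, the two timestamps, reason and message. Decoding it takes nothing more than encoding/json; the sketch below defines a local struct mirroring the fields visible in the log rather than pulling in k8s.io/api.

// condition.go - decodes the condition={...} payload attached to the
// "Node became not ready" records above.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied from one of the records above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:00Z","lastTransitionTime":"2025-12-02T15:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s since %s: %s\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
}
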
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.052942 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:01 crc kubenswrapper[4933]: E1202 15:53:01.053160 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.060204 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.060244 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.060254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.060273 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.060288 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.163306 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.163371 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.163380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.163394 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.163404 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.266244 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.266319 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.266332 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.266349 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.266363 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.369607 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.369668 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.369679 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.369698 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.369708 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.473180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.473244 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.473271 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.473307 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.473333 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.576654 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.576698 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.576706 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.576722 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.576736 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.681117 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.681173 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.681185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.681208 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.681220 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.784648 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.784997 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.785066 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.785136 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.785197 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.888692 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.888741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.888754 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.888774 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.888790 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.992262 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.992338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.992362 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.992392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:01 crc kubenswrapper[4933]: I1202 15:53:01.992413 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:01Z","lastTransitionTime":"2025-12-02T15:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.052711 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.052711 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.052733 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:02 crc kubenswrapper[4933]: E1202 15:53:02.052932 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:02 crc kubenswrapper[4933]: E1202 15:53:02.053063 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:02 crc kubenswrapper[4933]: E1202 15:53:02.053320 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.096341 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.096406 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.096425 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.096449 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.096467 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.199126 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.199191 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.199211 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.199237 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.199256 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.302723 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.302801 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.302856 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.302893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.302911 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.405767 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.405860 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.405894 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.405923 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.405943 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.509181 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.509242 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.509264 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.509288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.509304 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.612033 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.612117 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.612141 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.612167 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.612184 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.714990 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.715053 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.715066 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.715090 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.715104 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.818943 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.819010 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.819050 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.819080 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.819103 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.922220 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.922312 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.922347 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.922377 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:02 crc kubenswrapper[4933]: I1202 15:53:02.922399 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:02Z","lastTransitionTime":"2025-12-02T15:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.025181 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.025253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.025266 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.025288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.025302 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.052888 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:03 crc kubenswrapper[4933]: E1202 15:53:03.053131 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.128009 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.128455 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.128704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.128766 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.128793 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.231246 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.231315 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.231330 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.231354 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.231369 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.334658 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.334714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.334726 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.334747 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.334762 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.437538 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.437584 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.437595 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.437612 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.437625 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.540611 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.541323 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.541421 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.541524 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.541608 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.645164 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.645221 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.645235 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.645259 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.645276 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.748523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.748595 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.748605 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.748625 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.748637 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.851397 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.851450 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.851466 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.851490 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.851580 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.955049 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.955163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.955179 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.955197 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:03 crc kubenswrapper[4933]: I1202 15:53:03.955209 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:03Z","lastTransitionTime":"2025-12-02T15:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.052519 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.052599 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.052523 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:04 crc kubenswrapper[4933]: E1202 15:53:04.052727 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:04 crc kubenswrapper[4933]: E1202 15:53:04.052911 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:04 crc kubenswrapper[4933]: E1202 15:53:04.053024 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.057627 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.057667 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.057677 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.057690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.057702 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.160717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.160805 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.160814 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.160856 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.160868 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.264181 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.264243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.264253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.264268 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.264279 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.367714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.368125 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.368144 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.368222 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.368238 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.472105 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.472180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.472199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.472230 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.472248 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.575044 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.575091 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.575101 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.575117 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.575128 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.678852 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.678903 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.678917 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.678936 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.678949 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.781606 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.781639 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.781647 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.781661 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.781671 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.854380 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:04 crc kubenswrapper[4933]: E1202 15:53:04.854574 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:53:04 crc kubenswrapper[4933]: E1202 15:53:04.854663 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:12.854642112 +0000 UTC m=+56.105868855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.884180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.884229 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.884237 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.884254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.884265 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.987068 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.987131 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.987146 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.987164 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:04 crc kubenswrapper[4933]: I1202 15:53:04.987176 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:04Z","lastTransitionTime":"2025-12-02T15:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.052479 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:05 crc kubenswrapper[4933]: E1202 15:53:05.052650 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.089728 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.089803 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.089816 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.089852 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.089864 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.192301 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.192331 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.192339 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.192351 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.192360 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.295712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.295754 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.295763 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.295778 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.295787 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.398568 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.398631 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.398648 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.398675 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.398692 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.502368 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.502458 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.502480 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.502505 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.502525 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.606423 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.606529 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.606554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.606593 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.606617 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.709514 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.709557 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.709567 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.709582 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.709592 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.813223 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.813289 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.813324 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.813354 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.813375 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.917344 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.917539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.917575 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.917605 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:05 crc kubenswrapper[4933]: I1202 15:53:05.917657 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:05Z","lastTransitionTime":"2025-12-02T15:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.021209 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.021321 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.021335 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.021352 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.021364 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.052996 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.053002 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:06 crc kubenswrapper[4933]: E1202 15:53:06.053192 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.053035 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:06 crc kubenswrapper[4933]: E1202 15:53:06.053344 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:06 crc kubenswrapper[4933]: E1202 15:53:06.053434 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.124517 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.124616 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.124667 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.124693 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.124743 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.228332 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.228407 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.228418 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.228438 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.228453 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.332045 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.332113 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.332134 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.332161 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.332184 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.435208 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.435270 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.435288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.435308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.435323 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.538420 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.538467 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.538477 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.538495 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.538507 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.641627 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.641675 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.641689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.641709 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.641728 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.745708 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.745778 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.745795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.745851 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.745870 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.849690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.849756 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.849774 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.849809 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.849866 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.953180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.953239 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.953253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.953272 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:06 crc kubenswrapper[4933]: I1202 15:53:06.953287 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:06Z","lastTransitionTime":"2025-12-02T15:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.053174 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:07 crc kubenswrapper[4933]: E1202 15:53:07.053332 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.054437 4933 scope.go:117] "RemoveContainer" containerID="04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.055951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.056010 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.056025 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.056047 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.056083 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.071569 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
6bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.097086 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.111993 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.125484 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.139174 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.148003 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.159081 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.159146 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.159164 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.159188 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.159205 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.163295 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.176970 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.192425 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.207082 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.222412 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.236303 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.250011 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.262931 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.263037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.263071 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.263083 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.263101 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.263112 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.275975 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.291309 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.365498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.365550 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.365561 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.365578 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.365597 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.456069 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/1.log" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.460085 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.460390 4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.467733 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.467765 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.467774 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.467786 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.467796 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.475264 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.490092 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.506609 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.524405 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.541044 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.567073 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.571362 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.571419 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.571436 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.571455 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.571467 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.592899 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.617112 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.628719 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.643861 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.658853 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.674056 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.674301 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.674771 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.674901 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.674991 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.675344 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.692438 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.712337 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 
2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.726511 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 
15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.739972 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.777831 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.777879 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.777892 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.777911 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.777926 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.880569 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.880642 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.880656 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.880679 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.880696 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.983235 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.983646 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.983783 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.983935 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:07 crc kubenswrapper[4933]: I1202 15:53:07.984044 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:07Z","lastTransitionTime":"2025-12-02T15:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.052656 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.052780 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.052914 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.053085 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.052693 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.053591 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.087951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.088044 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.088072 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.088108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.088132 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.092530 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.092721 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:53:40.092686993 +0000 UTC m=+83.343913736 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.093015 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.093217 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093307 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093467 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093490 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093510 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.093424 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093549 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093706 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093551 4933 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:40.093534795 +0000 UTC m=+83.344761528 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.093813 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.093953 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:40.093934225 +0000 UTC m=+83.345160948 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.094026 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.094168 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:40.09413204 +0000 UTC m=+83.345358893 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.094559 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.094747 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:40.094727715 +0000 UTC m=+83.345954458 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.192220 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.192580 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.192669 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.192757 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.192957 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.296314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.296367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.296381 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.296401 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.296415 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.399185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.399276 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.399302 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.399337 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.399362 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.467430 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/2.log" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.468607 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/1.log" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.472965 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7" exitCode=1 Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.473014 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.473115 4933 scope.go:117] "RemoveContainer" containerID="04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.478608 4933 scope.go:117] "RemoveContainer" containerID="e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7" Dec 02 15:53:08 crc kubenswrapper[4933]: E1202 15:53:08.479006 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.493791 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.502010 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.502141 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.502153 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.502171 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.502182 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
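
Buried at the end of the kube-controller-manager-crc patch failure above is the decisive error: the patch cannot go through because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node's clock reads 2025-12-02. This is consistent with a CRC VM resumed long after its certificates lapsed, and the same x509 failure recurs in the ovnkube-controller termination message below, against the /node endpoint of the same webhook. A short probe to read the offending certificate's validity window directly; InsecureSkipVerify is required precisely because verification is what fails:

    // certprobe.go: dial the webhook endpoint from the log and print the
    // serving certificate's validity window.
    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore)
        fmt.Println("notAfter: ", cert.NotAfter)
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }
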
Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.524580 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 
2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.538713 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.552627 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.566617 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.580612 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.594598 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.604629 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.604674 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.604685 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc 
kubenswrapper[4933]: I1202 15:53:08.604701 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.604713 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.609553 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.625020 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.638143 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.650665 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.668208 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.683131 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.696612 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.707875 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.707920 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.707930 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.707948 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.707960 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.718232 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.732989 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:08Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.811784 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.811869 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.811883 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.811902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.812209 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.915863 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.915973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.915986 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.916007 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:08 crc kubenswrapper[4933]: I1202 15:53:08.916020 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:08Z","lastTransitionTime":"2025-12-02T15:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.019764 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.019812 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.019836 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.019852 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.019863 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.052950 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:09 crc kubenswrapper[4933]: E1202 15:53:09.053188 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.122575 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.122645 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.122663 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.122695 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.122713 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.226109 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.226147 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.226158 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.226173 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.226184 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.329373 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.329450 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.329475 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.329507 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.329528 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.432637 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.432703 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.432716 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.432732 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.432742 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.481250 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/2.log" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.536251 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.536296 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.536305 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.536323 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.536333 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.640174 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.640262 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.640282 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.640308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.640328 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.742913 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.742976 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.742987 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.743007 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.743021 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.846308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.846392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.846402 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.846417 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.846427 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.949705 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.949786 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.949863 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.949917 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:09 crc kubenswrapper[4933]: I1202 15:53:09.949941 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:09Z","lastTransitionTime":"2025-12-02T15:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053174 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053210 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053642 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053768 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053806 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: E1202 15:53:10.053795 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053856 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053880 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.053898 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: E1202 15:53:10.053942 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:10 crc kubenswrapper[4933]: E1202 15:53:10.054008 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.157699 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.157773 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.157787 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.157804 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.157841 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.261279 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.261358 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.261376 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.261402 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.261420 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.364704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.364762 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.364773 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.364798 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.364812 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.468606 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.468658 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.468670 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.468689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.468706 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.573119 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.573184 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.573198 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.573221 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.573242 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.676961 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.677433 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.677665 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.677990 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.678135 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.782078 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.782140 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.782149 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.782170 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.782182 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.885389 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.885465 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.885481 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.885504 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.885524 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.926952 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.927049 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.927069 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.927099 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.927112 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: E1202 15:53:10.950695 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:10Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.956356 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.956430 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.956796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.956869 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.956886 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:10 crc kubenswrapper[4933]: E1202 15:53:10.974990 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:10Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.980142 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.980190 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.980342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.980751 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:10 crc kubenswrapper[4933]: I1202 15:53:10.980777 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:10Z","lastTransitionTime":"2025-12-02T15:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: E1202 15:53:10.999983 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:10Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.004896 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.004952 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.004974 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.004998 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.005016 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: E1202 15:53:11.022913 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:11Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.027236 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.027295 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.027308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.027335 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.027351 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: E1202 15:53:11.045307 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:11Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:11 crc kubenswrapper[4933]: E1202 15:53:11.045511 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.047462 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.047515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.047531 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.047556 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.047572 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.052955 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:11 crc kubenswrapper[4933]: E1202 15:53:11.053221 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.150372 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.151868 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.151886 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.151908 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.151923 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.254530 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.254584 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.254602 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.254628 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.254647 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.357522 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.357930 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.357940 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.357960 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.357971 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.461423 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.461489 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.461508 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.461530 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.461547 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.564376 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.564433 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.564444 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.564465 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.564479 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.667559 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.667616 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.667627 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.667650 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.667664 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.770182 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.770266 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.770286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.770314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.770332 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.874151 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.874225 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.874246 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.874275 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.874298 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.977297 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.977353 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.977366 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.977389 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:11 crc kubenswrapper[4933]: I1202 15:53:11.977404 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:11Z","lastTransitionTime":"2025-12-02T15:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.052383 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.052432 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.052445 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:12 crc kubenswrapper[4933]: E1202 15:53:12.052616 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:12 crc kubenswrapper[4933]: E1202 15:53:12.052749 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:12 crc kubenswrapper[4933]: E1202 15:53:12.052877 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.083943 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.084004 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.084018 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.084043 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.084059 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.186622 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.186686 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.186705 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.186730 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.186748 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.290489 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.290532 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.290541 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.290556 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.290567 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.393794 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.393897 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.393917 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.393947 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.393967 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.497133 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.497192 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.497207 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.497229 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.497249 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
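
Interleaved with the webhook failures, the kubelet keeps republishing Ready=False because NetworkReady=false: no CNI configuration file is found in /etc/kubernetes/cni/net.d/, which is consistent with the ovnkube-controller container crash-looping (see the CrashLoopBackOff record further down). A small sketch that performs the same directory check the message describes, assuming the standard CNI config extensions .conf/.conflist/.json:

```go
// cnicheck.go - sketch: reproduce the kubelet's complaint by listing
// /etc/kubernetes/cni/net.d/ (the path named in the NetworkReady=false
// message) and reporting whether any CNI config files are present.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		// libcni's standard CNI config extensions.
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration file in", dir,
			"- matches the NetworkReady=false records above")
	}
}
```
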
Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.508241 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.524440 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.524856 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\
\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.544126 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.563366 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.581409 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.602406 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.602701 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.602805 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.602853 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.602885 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.602903 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
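
Each "Failed to update status for pod" record above carries the attempted JSON status patch inside the quoted err field, with one extra level of backslash escaping, which makes the payloads hard to read. A sketch that recovers the patch from such a line and pretty-prints it; the shortened sample string is an assumption, cut down from the machine-config-daemon-d2p6w record above:

```go
// patchdump.go - sketch (assumed helper, not an OpenShift tool): recover the
// JSON status patch embedded in a "Failed to update status for pod" line.
// In the journal text the patch appears with escaped quotes (\\\"); undoing
// that one level of escaping yields plain JSON.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strings"
)

func main() {
	// Shortened sample based on the machine-config-daemon-d2p6w record above.
	line := `failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"}}\" for pod`

	start := strings.Index(line, `\"{`)    // opening of the quoted patch
	end := strings.LastIndex(line, `}\"`)  // closing of the quoted patch
	if start < 0 || end < 0 {
		log.Fatal("no embedded patch found")
	}
	patch := line[start+2 : end+1]                 // the {...} body, still escaped
	patch = strings.ReplaceAll(patch, `\\\"`, `"`) // undo one escaping level

	var buf bytes.Buffer
	if err := json.Indent(&buf, []byte(patch), "", "  "); err != nil {
		log.Fatalf("not valid JSON after unescaping: %v", err)
	}
	fmt.Println(buf.String())
}
```

The same unescape-then-indent step applies to the node status patches earlier in the log, whose payloads embed the node's full image list.
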
Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.621730 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.653431 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.667132 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.680924 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.694242 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.695247 4933 scope.go:117] "RemoveContainer" containerID="e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.695310 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: E1202 15:53:12.695424 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.715026 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.715088 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.715101 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.715123 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.715138 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.725499 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.768558 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.782844 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.796669 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.817452 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee59
7cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ab3fe2bc8cab1157c68d938926ecb8b4fc1bca6f2bba21333de8a7213ad8df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:52:53Z\\\",\\\"message\\\":\\\"]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1202 15:52:53.202107 6385 services_controller.go:444] Built service openshift-machine-api/control-plane-machine-set-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:52:53.202117 6385 services_controller.go:445] Built service openshift-machine-api/control-plane-machine-set-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:52:53.201748 6385 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:52:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.818336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.818392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.818404 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.818429 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.818443 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.830376 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.842334 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.854210 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.865795 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.884402 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.921688 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.921766 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:12 crc 
kubenswrapper[4933]: I1202 15:53:12.921789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.921817 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.921865 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:12Z","lastTransitionTime":"2025-12-02T15:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.946338 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:5
2:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.951741 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:12 crc kubenswrapper[4933]: E1202 15:53:12.951981 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:53:12 crc kubenswrapper[4933]: E1202 15:53:12.952073 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:53:28.952051749 +0000 UTC m=+72.203278452 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.960130 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.981201 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee59
7cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:12 crc kubenswrapper[4933]: I1202 15:53:12.996797 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:12Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.014408 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.024633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.024693 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.024720 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.024745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.024760 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.029689 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.048086 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.052432 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:13 crc kubenswrapper[4933]: E1202 15:53:13.052663 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.065915 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.079238 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.099904 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.117463 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.127287 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.127336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.127348 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.127369 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.127387 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.133724 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.148818 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:13Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.230307 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.230339 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.230349 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.230365 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.230377 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.333700 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.333759 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.333775 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.333799 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.333854 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.436971 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.437047 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.437060 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.437081 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.437095 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.539975 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.540030 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.540040 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.540059 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.540072 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.642794 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.642857 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.642867 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.642884 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.642894 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.745961 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.746001 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.746013 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.746031 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.746043 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.850145 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.850232 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.850257 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.850292 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.850318 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.953698 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.953792 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.953817 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.953894 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:13 crc kubenswrapper[4933]: I1202 15:53:13.953922 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:13Z","lastTransitionTime":"2025-12-02T15:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.053168 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.053218 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.053251 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:14 crc kubenswrapper[4933]: E1202 15:53:14.053993 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:14 crc kubenswrapper[4933]: E1202 15:53:14.054186 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:14 crc kubenswrapper[4933]: E1202 15:53:14.054232 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.056560 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.056683 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.056779 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.056936 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.057049 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.160051 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.160116 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.160130 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.160157 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.160173 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.263511 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.263582 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.263601 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.263634 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.263655 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.366482 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.366855 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.367646 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.367752 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.367867 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.471431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.471518 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.471534 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.471560 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.471574 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.574533 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.574584 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.574595 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.574613 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.574627 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.677946 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.677990 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.678000 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.678017 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.678028 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.781099 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.781155 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.781165 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.781184 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.781197 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.884203 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.884255 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.884264 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.884282 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.884297 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.987934 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.987978 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.987989 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.988008 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:14 crc kubenswrapper[4933]: I1202 15:53:14.988020 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:14Z","lastTransitionTime":"2025-12-02T15:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.052802 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:15 crc kubenswrapper[4933]: E1202 15:53:15.053009 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.090721 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.090789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.090808 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.090860 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.090880 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.193755 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.193812 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.193837 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.193858 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.193895 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.297082 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.297664 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.297802 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.298075 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.298220 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.401784 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.401859 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.401875 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.401899 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.401920 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.504950 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.505031 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.505049 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.505074 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.505099 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.607893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.607937 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.607947 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.607959 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.607968 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.710285 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.710325 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.710334 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.710349 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.710358 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.814429 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.814482 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.814498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.814521 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.814538 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.917628 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.917694 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.917712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.917738 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:15 crc kubenswrapper[4933]: I1202 15:53:15.917758 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:15Z","lastTransitionTime":"2025-12-02T15:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.020348 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.020427 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.020448 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.020478 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.020499 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.052986 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.053094 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.053018 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:16 crc kubenswrapper[4933]: E1202 15:53:16.053165 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:16 crc kubenswrapper[4933]: E1202 15:53:16.053590 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:16 crc kubenswrapper[4933]: E1202 15:53:16.053687 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.124304 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.124361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.124380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.124408 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.124427 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.227966 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.228220 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.228238 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.228263 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.228286 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.331649 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.331748 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.331765 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.331789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.331808 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.435005 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.435257 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.435269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.435289 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.435305 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.539116 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.539186 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.539205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.539232 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.539251 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.642859 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.642947 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.642973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.643002 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.643021 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.747123 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.747182 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.747196 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.747217 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.747232 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.850250 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.850300 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.850309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.850325 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.850335 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.953361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.953428 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.953452 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.953481 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:16 crc kubenswrapper[4933]: I1202 15:53:16.953501 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:16Z","lastTransitionTime":"2025-12-02T15:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.052409 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:17 crc kubenswrapper[4933]: E1202 15:53:17.052643 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.057372 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.057421 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.057432 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.057452 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.057467 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.071202 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.090415 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee59
7cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.103174 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.114945 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.130498 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.141545 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.153150 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.160447 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.160506 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.160521 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.160539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.160553 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.168804 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.183918 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.201883 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.214985 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.229813 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.245938 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.260640 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.263202 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.263244 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.263254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.263269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.263282 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.278313 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.296889 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.313665 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:17Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.366298 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.366346 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.366355 4933 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.366373 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.366383 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.469317 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.469362 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.469371 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.469420 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.469439 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.572027 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.572080 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.572097 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.572116 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.572129 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.675733 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.675803 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.675867 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.675900 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.675934 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.780858 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.781160 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.781351 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.781509 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.781656 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.884648 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.884685 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.884696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.884712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.884723 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.987678 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.987717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.987729 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.987745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:17 crc kubenswrapper[4933]: I1202 15:53:17.987758 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:17Z","lastTransitionTime":"2025-12-02T15:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.052379 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.052409 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.052449 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:18 crc kubenswrapper[4933]: E1202 15:53:18.053078 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:18 crc kubenswrapper[4933]: E1202 15:53:18.053345 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:18 crc kubenswrapper[4933]: E1202 15:53:18.053353 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.091057 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.091390 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.091462 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.091571 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.091659 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.194632 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.194687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.194702 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.194724 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.194740 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.297208 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.297294 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.297315 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.297342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.297362 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.400783 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.400872 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.400885 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.400904 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.400918 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.503254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.503293 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.503303 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.503318 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.503328 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.746365 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.746672 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.746745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.746841 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.746921 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.849342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.850316 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.850416 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.850512 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.850591 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.953432 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.953502 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.953520 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.953546 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:18 crc kubenswrapper[4933]: I1202 15:53:18.953563 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:18Z","lastTransitionTime":"2025-12-02T15:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
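The cycle above is the kubelet re-recording the same node conditions while the Ready condition stays False for a single reason: no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal sketch of confirming that condition on the node itself (the path is taken verbatim from the log message; the .conf/.conflist/.json suffixes are the standard CNI conventions, not something the log states):

```python
# Minimal diagnostic sketch (not part of the log): list the directory the
# kubelet names in the NetworkReady=false message. Assumption: run on the node.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # path taken verbatim from the log message

def cni_configs() -> list[Path]:
    """Return the network definitions a CNI-aware runtime would consider."""
    if not CNI_CONF_DIR.is_dir():
        return []
    # Standard CNI config extensions (convention, not stated in the log).
    return sorted(p for p in CNI_CONF_DIR.iterdir()
                  if p.suffix in {".conf", ".conflist", ".json"})

if __name__ == "__main__":
    found = cni_configs()
    if found:
        print("CNI configs present:")
        for p in found:
            print(f"  {p}")
    else:
        print(f"no CNI configuration file in {CNI_CONF_DIR} -- the condition the kubelet reports")
```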
Dec 02 15:53:19 crc kubenswrapper[4933]: I1202 15:53:19.053298 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:19 crc kubenswrapper[4933]: E1202 15:53:19.053536 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
[... node-status cycle repeats at roughly 100 ms intervals from 15:53:19.056 through 15:53:19.987 ...]
Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.052710 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.052713 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.052899 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:53:20 crc kubenswrapper[4933]: E1202 15:53:20.052928 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:20 crc kubenswrapper[4933]: E1202 15:53:20.053117 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:20 crc kubenswrapper[4933]: E1202 15:53:20.053221 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.091125 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.091183 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.091381 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.091403 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.091421 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:20Z","lastTransitionTime":"2025-12-02T15:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.195234 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.195313 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.195336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.195367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:20 crc kubenswrapper[4933]: I1202 15:53:20.195393 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:20Z","lastTransitionTime":"2025-12-02T15:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.053158 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.053309 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
[... node-status cycle repeats at 15:53:21.122 ...]
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.122365 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.122402 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.122413 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.122426 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.122436 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.158973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.159012 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.159021 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.159035 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.159046 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.173791 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:21Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.181840 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.181879 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.181891 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.181906 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.181915 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.193664 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:21Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.196879 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.196902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.196910 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.196921 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.196930 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.209863 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:21Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.213142 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.213163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.213171 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.213181 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.213190 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.228263 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:21Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.232378 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.232413 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.232422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.232436 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.232447 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.247337 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:21Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:21 crc kubenswrapper[4933]: E1202 15:53:21.247563 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.250171 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
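Every status-update attempt above fails the same way: the kubelet's PATCH is rejected because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, and the node stays NotReady because /etc/kubernetes/cni/net.d/ contains no CNI configuration. Both conditions can be confirmed directly on the node. What follows is a minimal diagnostic sketch, not part of the log: it assumes Python 3 with the third-party cryptography package is available on the host, and it reuses the endpoint and path exactly as they appear in the messages.

    import os
    import ssl
    from datetime import datetime, timezone
    from cryptography import x509  # assumption: package installed on the node

    WEBHOOK = ("127.0.0.1", 9743)          # Post URL from the webhook error
    CNI_DIR = "/etc/kubernetes/cni/net.d"  # path from the NetworkReady message

    # Fetch the webhook's serving certificate WITHOUT verifying it
    # (verification is exactly what fails in the log) and print its validity.
    pem = ssl.get_server_certificate(WEBHOOK)
    cert = x509.load_pem_x509_certificate(pem.encode())
    not_after = cert.not_valid_after_utc   # cryptography >= 42; older: not_valid_after
    print("webhook cert notAfter:", not_after)
    print("expired:", not_after < datetime.now(timezone.utc))

    # An empty or missing net.d directory matches the NetworkPluginNotReady message.
    if os.path.isdir(CNI_DIR):
        print(CNI_DIR, "->", os.listdir(CNI_DIR) or "empty")
    else:
        print(CNI_DIR, "-> missing")

On this log's evidence the sketch would report a notAfter of 2025-08-24 17:21:41 UTC with expired: True, and an empty or absent net.d directory; rotating the node-identity certificates and restoring a CNI configuration would be the corresponding fixes.

Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.250171 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc"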
event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.250234 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.250249 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.250272 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.250316 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.353210 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.353266 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.353277 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.353299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.353311 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.456494 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.456836 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.456940 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.457055 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.457138 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.560973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.561073 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.561090 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.561145 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.561167 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.664373 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.664724 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.664839 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.664932 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.665021 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.767590 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.767644 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.767660 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.767683 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.767700 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.871027 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.871330 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.871426 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.871522 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.871591 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.975145 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.975222 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.975237 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.975282 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:21 crc kubenswrapper[4933]: I1202 15:53:21.975296 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:21Z","lastTransitionTime":"2025-12-02T15:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.052894 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.053038 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.053081 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:22 crc kubenswrapper[4933]: E1202 15:53:22.053217 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:22 crc kubenswrapper[4933]: E1202 15:53:22.053374 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:22 crc kubenswrapper[4933]: E1202 15:53:22.053488 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.077732 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.078175 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.078269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.078365 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.078440 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.181770 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.182433 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.182515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.182585 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.182661 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.285215 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.285688 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.285861 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.286037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.286160 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.388934 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.389009 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.389018 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.389031 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.389040 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.491208 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.491598 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.491712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.491862 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.491962 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.595530 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.595576 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.595594 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.595612 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.595628 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.698653 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.699025 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.699113 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.699243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.699358 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.803380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.803633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.803753 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.803913 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.804055 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.907343 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.907432 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.907449 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.907479 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:22 crc kubenswrapper[4933]: I1202 15:53:22.907507 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:22Z","lastTransitionTime":"2025-12-02T15:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.010309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.011246 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.011489 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.011923 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.012160 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.055717 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:23 crc kubenswrapper[4933]: E1202 15:53:23.056210 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.068504 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.114813 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.114889 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.114902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.114925 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.114945 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.217948 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.218013 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.218027 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.218047 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.218059 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.321228 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.321279 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.321320 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.321338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.321353 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.424094 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.424143 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.424155 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.424176 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.424190 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.527557 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.527605 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.527615 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.527634 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.527645 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.630527 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.630572 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.630581 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.630599 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.630611 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.733922 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.733991 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.734005 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.734026 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.734041 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.837371 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.837434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.837457 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.837477 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.837492 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.941165 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.941544 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.941728 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.941891 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:23 crc kubenswrapper[4933]: I1202 15:53:23.942003 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:23Z","lastTransitionTime":"2025-12-02T15:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.045356 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.045399 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.045410 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.045431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.045443 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.052685 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:53:24 crc kubenswrapper[4933]: E1202 15:53:24.052859 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.052965 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:24 crc kubenswrapper[4933]: E1202 15:53:24.053061 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.052892 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:24 crc kubenswrapper[4933]: E1202 15:53:24.053480 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.148801 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.148858 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.148874 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.148893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.148906 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.251842 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.251885 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.251893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.251909 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.251919 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.355290 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.355355 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.355373 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.355395 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.355409 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.458482 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.458543 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.458555 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.458578 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.458591 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.561576 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.561629 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.561645 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.561668 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.561685 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.664542 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.664620 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.664655 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.664692 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.664714 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.766742 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.766799 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.766818 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.766860 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.766877 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.869469 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.869515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.869524 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.869543 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.869553 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.971971 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.972018 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.972027 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.972042 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:24 crc kubenswrapper[4933]: I1202 15:53:24.972051 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:24Z","lastTransitionTime":"2025-12-02T15:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.053630 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:25 crc kubenswrapper[4933]: E1202 15:53:25.053939 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.074034 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.074114 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.074140 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.074168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.074189 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.177097 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.177163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.177188 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.177247 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.177269 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.279632 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.279682 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.279691 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.279704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.279714 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.382879 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.382928 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.382937 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.382955 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.382965 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.485662 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.485704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.485714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.485730 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.485741 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.588805 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.588871 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.588882 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.588897 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.588908 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.691851 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.691902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.691912 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.691928 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.691938 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.793931 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.794002 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.794021 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.794047 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.794065 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.896523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.896590 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.896607 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.896630 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.896648 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.999226 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.999291 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.999309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.999335 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:25 crc kubenswrapper[4933]: I1202 15:53:25.999355 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:25Z","lastTransitionTime":"2025-12-02T15:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.052689 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.052762 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:53:26 crc kubenswrapper[4933]: E1202 15:53:26.052861 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.052780 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:26 crc kubenswrapper[4933]: E1202 15:53:26.053185 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:26 crc kubenswrapper[4933]: E1202 15:53:26.053218 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.102509 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.102564 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.102576 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.102591 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.102602 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.206256 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.206303 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.206312 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.206328 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.206338 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.309379 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.309415 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.309423 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.309438 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.309447 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.412461 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.412509 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.412521 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.412537 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.412547 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.515697 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.515742 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.515755 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.515776 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.515790 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.619653 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.619951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.620124 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.620174 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.620201 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.723407 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.723459 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.723470 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.723487 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.723498 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.827230 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.827300 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.827316 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.827341 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.827356 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.931761 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.931853 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.931869 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.931892 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:26 crc kubenswrapper[4933]: I1202 15:53:26.931907 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:26Z","lastTransitionTime":"2025-12-02T15:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.035444 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.035510 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.035522 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.035541 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.035553 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.053075 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:27 crc kubenswrapper[4933]: E1202 15:53:27.053284 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.068659 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.082288 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.093307 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.106256 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.121022 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.137295 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.138189 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.138226 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.138241 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.138260 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.138276 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.155316 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.167155 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.184878 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.197125 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.210349 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.224689 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.235482 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.241289 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.241357 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.241382 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.241413 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.241436 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.246186 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.259771 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.273308 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.288396 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.302284 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:27Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.343485 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.344367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.344520 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.344683 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.344801 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.447474 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.447518 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.447528 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.447564 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.447573 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.549891 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.549934 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.549949 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.549965 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.549974 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.653138 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.653212 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.653229 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.653258 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.653278 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.755385 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.755430 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.755440 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.755458 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.755469 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.858503 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.858544 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.858554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.858569 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.858579 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.962226 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.962282 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.962299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.962325 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:27 crc kubenswrapper[4933]: I1202 15:53:27.962343 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:27Z","lastTransitionTime":"2025-12-02T15:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.053031 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.053154 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:28 crc kubenswrapper[4933]: E1202 15:53:28.053214 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.053449 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:28 crc kubenswrapper[4933]: E1202 15:53:28.053465 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.054368 4933 scope.go:117] "RemoveContainer" containerID="e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7" Dec 02 15:53:28 crc kubenswrapper[4933]: E1202 15:53:28.054596 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:53:28 crc kubenswrapper[4933]: E1202 15:53:28.054642 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.065286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.065314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.065325 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.065343 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.065357 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.168679 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.168737 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.168747 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.168767 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.168778 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.272511 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.272570 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.272587 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.272614 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.272633 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.374775 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.374801 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.374809 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.374839 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.374849 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.478409 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.478447 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.478461 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.478481 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.478498 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.581719 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.581786 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.581800 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.581834 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.581845 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.684085 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.684128 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.684137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.684153 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.684162 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.786035 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.786079 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.786090 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.786108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.786120 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.889905 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.890026 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.890056 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.890091 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.890113 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.958581 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:28 crc kubenswrapper[4933]: E1202 15:53:28.958783 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 15:53:28 crc kubenswrapper[4933]: E1202 15:53:28.958935 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:00.958907107 +0000 UTC m=+104.210133850 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.992910 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.992983 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.993008 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.993038 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:28 crc kubenswrapper[4933]: I1202 15:53:28.993062 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:28Z","lastTransitionTime":"2025-12-02T15:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.052530 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:53:29 crc kubenswrapper[4933]: E1202 15:53:29.052804 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.095761 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.095810 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.095854 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.095873 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.095885 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.198625 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.198676 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.198689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.198707 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.198718 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.302053 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.302100 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.302109 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.302124 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.302134 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.404159 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.404209 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.404258 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.404276 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.404287 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.507259 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.507299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.507314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.507330 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.507341 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.609988 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.610033 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.610041 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.610057 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.610068 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.716757 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.717075 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.717102 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.717131 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.717159 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.819301 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.819352 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.819364 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.819384 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.819396 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.921935 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.921993 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.922009 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.922036 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:29 crc kubenswrapper[4933]: I1202 15:53:29.922054 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:29Z","lastTransitionTime":"2025-12-02T15:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.025613 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.025672 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.025690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.025712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.025724 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.052655 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.052724 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.052784 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:53:30 crc kubenswrapper[4933]: E1202 15:53:30.052811 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:53:30 crc kubenswrapper[4933]: E1202 15:53:30.052899 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:53:30 crc kubenswrapper[4933]: E1202 15:53:30.052980 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.129137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.129566 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.129676 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.129790 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.129910 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.232775 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.232840 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.232851 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.232871 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.232880 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.335588 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.335667 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.335680 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.335719 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.335733 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.438909 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.438961 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.438973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.438993 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.439007 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.542195 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.542243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.542253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.542269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.542281 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.644819 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.644971 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.644994 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.645026 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.645049 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.748967 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.749085 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.749108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.749137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.749156 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.785737 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/0.log"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.785803 4933 generic.go:334] "Generic (PLEG): container finished" podID="b033c545-93a2-4401-842b-22456e44216b" containerID="a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759" exitCode=1
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.785864 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerDied","Data":"a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.786392 4933 scope.go:117] "RemoveContainer" containerID="a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.802521 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.819759 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.847058 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.852998 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.853047 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.853058 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.853097 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.853108 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.860447 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.874176 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.891291 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.907629 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z"
Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.923561 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.937731 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.953455 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.956352 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.956408 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.956425 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:30 crc 
kubenswrapper[4933]: I1202 15:53:30.956449 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.956466 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:30Z","lastTransitionTime":"2025-12-02T15:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.971593 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:30 crc kubenswrapper[4933]: I1202 15:53:30.985339 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.001900 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:30Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.017918 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.036080 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-api
server-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.053072 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.053202 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.055751 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.058650 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.058709 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.058725 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.058746 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.058761 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.071656 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.092399 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.161579 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.161634 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.161643 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.161660 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.161670 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.264423 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.264482 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.264495 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.264519 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.264534 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.368462 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.368570 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.368586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.368610 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.368624 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.472338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.472432 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.472455 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.472488 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.472515 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.573123 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.573224 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.573247 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.573283 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.573310 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.593268 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.597881 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.597917 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.597925 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.597942 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.597952 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.615524 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.620586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.620658 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.620687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.620715 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.620736 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.641055 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.645551 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.645582 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.645589 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.645601 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.645610 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.665870 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.671068 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.671132 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.671152 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.671178 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.671202 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.686423 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: E1202 15:53:31.686567 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.688552 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
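[annotation] The failure above is the section's root cause: every node-status patch is rejected because the node.network-node-identity.openshift.io webhook on https://127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) is more than three months before the node's clock (2025-12-02T15:53:31Z), until kubelet gives up with "update node status exceeds retry count". Below is a minimal sketch of the same validity-window test that Go's x509 verifier applies, assuming the third-party cryptography package (>= 42 for the *_utc accessors); the certificate path is hypothetical, and the timestamps are the ones shown in the log.

```python
# Illustrative only: the validity-window check behind
# "x509: certificate has expired or is not yet valid".
from datetime import datetime, timezone

from cryptography import x509  # pip install "cryptography>=42"

CERT_PATH = "/path/to/webhook/tls.crt"  # hypothetical; the real path is not in the log

with open(CERT_PATH, "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

now = datetime(2025, 12, 2, 15, 53, 31, tzinfo=timezone.utc)  # node clock per the log

if not (cert.not_valid_before_utc <= now <= cert.not_valid_after_utc):
    print("tls: failed to verify certificate: x509: certificate has expired "
          f"or is not yet valid: current time {now:%Y-%m-%dT%H:%M:%SZ} "
          f"is after {cert.not_valid_after_utc:%Y-%m-%dT%H:%M:%SZ}")
```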
event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.688604 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.688618 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.688638 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.688653 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.798283 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.798327 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.798336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.798355 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.798369 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.800285 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/0.log" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.800376 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerStarted","Data":"cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.825745 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.845037 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
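[annotation] The repeating NodeNotReady condition above ("no CNI configuration file in /etc/kubernetes/cni/net.d/") is kubelet's network-readiness gate: the runtime keeps reporting NetworkReady=false until a network config appears in that directory. A sketch of the equivalent check follows; the directory is the one named in the log, while the .conf/.conflist/.json suffix list is an assumption mirroring the usual libcni convention.

```python
# Illustrative sketch: the readiness gate reduces to "does the CNI conf
# directory contain at least one usable configuration file?"
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # directory named in the log
CNI_SUFFIXES = {".conf", ".conflist", ".json"}    # assumption: libcni defaults

def network_ready(conf_dir: Path = CNI_CONF_DIR) -> bool:
    """Return True once any CNI configuration file is present."""
    if not conf_dir.is_dir():
        return False
    return any(p.suffix in CNI_SUFFIXES for p in conf_dir.iterdir() if p.is_file())

if not network_ready():
    print("container runtime network not ready: NetworkReady=false "
          "(no CNI configuration file in /etc/kubernetes/cni/net.d/)")
```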
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.865808 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.885406 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.902195 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.902308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.902369 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.902399 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.902418 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:31Z","lastTransitionTime":"2025-12-02T15:53:31Z","reason":"KubeletNotReady","message":"container 
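[annotation] The kube-multus lastState above explains its restartCount of 1: the daemon copied its CNI binaries at 15:52:45, then polled for the readiness indicator file /host/run/multus/cni/net.d/10-ovn-kubernetes.conf until 15:53:30 and exited 1 with "pollimmediate error: timed out waiting for the condition". Below is a sketch of that poll-immediate pattern (check once up front, then on an interval until a deadline); the one-second interval and 45-second timeout are assumptions inferred from the timestamps, not multus' configured values.

```python
# Illustrative poll-immediate loop: test the condition immediately, then at a
# fixed interval until the file appears or the deadline passes.
import time
from pathlib import Path

READINESS_FILE = Path("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf")

def wait_for_file(path: Path, interval: float = 1.0, timeout: float = 45.0) -> bool:
    deadline = time.monotonic() + timeout
    while True:
        if path.exists():          # condition satisfied
            return True
        if time.monotonic() >= deadline:
            return False           # "timed out waiting for the condition"
        time.sleep(interval)

if not wait_for_file(READINESS_FILE):
    raise SystemExit(f"still waiting for readinessindicatorfile @ {READINESS_FILE}")
```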
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.907562 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.928625 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.943516 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.966289 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.983221 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:31 crc kubenswrapper[4933]: I1202 15:53:31.995674 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:31Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.004687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.004728 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.004741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.004759 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.004774 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.013427 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.026279 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.040567 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.052997 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.053023 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.053007 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:32 crc kubenswrapper[4933]: E1202 15:53:32.053132 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:32 crc kubenswrapper[4933]: E1202 15:53:32.053192 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:32 crc kubenswrapper[4933]: E1202 15:53:32.053260 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.053863 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.068155 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.081588 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.093336 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.107621 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.107654 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.107665 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.107680 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.107691 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.110309 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:32Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.210427 4933 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.210487 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.210505 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.210529 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.210546 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.312659 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.312696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.312707 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.312720 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.312728 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.414474 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.414520 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.414536 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.414558 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.414573 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.517750 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.517837 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.517852 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.517875 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.517894 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.621077 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.621161 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.621174 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.621199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.621218 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.723898 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.723955 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.723966 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.723985 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.723998 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.826522 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.826594 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.826612 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.826638 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.826656 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.928759 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.928802 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.928853 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.928872 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:32 crc kubenswrapper[4933]: I1202 15:53:32.928882 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:32Z","lastTransitionTime":"2025-12-02T15:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.031382 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.031431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.031449 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.031466 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.031491 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.053156 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:33 crc kubenswrapper[4933]: E1202 15:53:33.053312 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.134198 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.134260 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.134278 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.134317 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.134335 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.237142 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.237179 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.237187 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.237199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.237209 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.340370 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.340451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.340469 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.340494 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.340514 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.444911 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.444984 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.445001 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.445037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.445055 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.548159 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.548218 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.548231 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.548251 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.548266 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.651901 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.652006 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.652050 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.652070 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.652080 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.755973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.756118 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.756148 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.756181 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.756261 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.858653 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.858705 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.858715 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.858730 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.858742 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.961042 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.961083 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.961092 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.961107 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:33 crc kubenswrapper[4933]: I1202 15:53:33.961118 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:33Z","lastTransitionTime":"2025-12-02T15:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.053164 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.053265 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.053169 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:34 crc kubenswrapper[4933]: E1202 15:53:34.053466 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:34 crc kubenswrapper[4933]: E1202 15:53:34.053610 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:34 crc kubenswrapper[4933]: E1202 15:53:34.053855 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.063171 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.063201 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.063215 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.063227 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.063237 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.165396 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.165677 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.165796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.165922 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.165991 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.268791 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.268842 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.268851 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.268864 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.268874 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.372403 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.372476 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.372495 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.372528 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.372549 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.475853 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.475914 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.475926 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.475948 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.475960 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.578942 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.578983 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.578995 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.579011 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.579022 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.683013 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.683589 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.683688 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.683974 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.684095 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.788236 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.788654 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.788925 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.789143 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.789316 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.893129 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.893215 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.893238 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.893267 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.893289 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.996679 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.996736 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.996753 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.996776 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:34 crc kubenswrapper[4933]: I1202 15:53:34.996792 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:34Z","lastTransitionTime":"2025-12-02T15:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.052904 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:35 crc kubenswrapper[4933]: E1202 15:53:35.053060 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.099158 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.099228 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.099254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.099285 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.099306 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.202462 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.202523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.202539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.202563 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.202580 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.306039 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.306113 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.306141 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.306209 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.306228 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.409188 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.409254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.409272 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.409299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.409315 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.512988 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.513063 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.513080 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.513104 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.513121 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.616750 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.616869 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.616904 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.616932 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.616950 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.720598 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.720657 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.720675 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.720698 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.720715 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.824056 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.824255 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.824354 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.824380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.824398 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.927804 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.928367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.928593 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.928727 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:35 crc kubenswrapper[4933]: I1202 15:53:35.928860 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:35Z","lastTransitionTime":"2025-12-02T15:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.031968 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.031998 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.032006 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.032020 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.032029 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.052367 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.052373 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.053019 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:36 crc kubenswrapper[4933]: E1202 15:53:36.053395 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:36 crc kubenswrapper[4933]: E1202 15:53:36.053745 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:36 crc kubenswrapper[4933]: E1202 15:53:36.054173 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.134591 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.134641 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.134657 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.134676 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.134690 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.237152 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.237577 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.237734 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.237917 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.238030 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.340496 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.340557 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.340574 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.340599 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.340618 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.443593 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.444036 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.444366 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.444683 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.444878 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.547955 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.548029 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.548132 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.548167 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.548193 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.652024 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.652128 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.652196 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.652225 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.652288 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.756038 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.756104 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.756127 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.756155 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.756176 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.859853 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.859921 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.859938 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.859961 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.859986 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.964944 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.964985 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.964997 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.965016 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:36 crc kubenswrapper[4933]: I1202 15:53:36.965028 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:36Z","lastTransitionTime":"2025-12-02T15:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.055379 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:37 crc kubenswrapper[4933]: E1202 15:53:37.055874 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.072508 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.072862 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.073096 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.073223 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.073311 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.074470 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.092807 4933 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.120841 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee59
7cbf3dbefcbb7dca7a9f90d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.134289 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.146671 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.157975 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.172163 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.175907 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.175947 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.175964 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.175981 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.175992 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.184160 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.197572 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.208560 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.223547 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.237505 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.251239 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.262916 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.275539 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.283358 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.283488 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.283515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.283547 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.283572 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.287513 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.298666 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.310707 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:37Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.387298 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.387385 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc 
kubenswrapper[4933]: I1202 15:53:37.387436 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.387462 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.387482 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.491581 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.491687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.491751 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.491786 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.491931 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.594265 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.594304 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.594319 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.594337 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.594351 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.698970 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.699033 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.699055 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.699084 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.699107 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.802358 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.802409 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.802425 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.802450 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.802467 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.906278 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.906329 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.906347 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.906382 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:37 crc kubenswrapper[4933]: I1202 15:53:37.906416 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:37Z","lastTransitionTime":"2025-12-02T15:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.009624 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.010211 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.010472 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.010670 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.010887 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.052409 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.052441 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:38 crc kubenswrapper[4933]: E1202 15:53:38.052529 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.052556 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:38 crc kubenswrapper[4933]: E1202 15:53:38.052591 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:38 crc kubenswrapper[4933]: E1202 15:53:38.052635 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.114133 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.114205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.114226 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.114255 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.114276 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.216647 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.216684 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.216696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.216713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.216726 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.320313 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.320400 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.320425 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.320453 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.320471 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.423631 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.423697 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.423715 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.423747 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.423767 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.528869 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.529548 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.529596 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.529628 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.529660 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.633416 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.633453 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.633461 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.633474 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.633483 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.736704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.736782 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.736796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.736810 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.736862 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.840176 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.840219 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.840229 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.840243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.840252 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.943314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.943394 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.943413 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.943442 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:38 crc kubenswrapper[4933]: I1202 15:53:38.943461 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:38Z","lastTransitionTime":"2025-12-02T15:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.045948 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.046001 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.046016 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.046037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.046052 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.053268 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:39 crc kubenswrapper[4933]: E1202 15:53:39.053399 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.149189 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.149234 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.149248 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.149268 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.149282 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.253180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.253738 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.254088 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.254547 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.255003 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.358497 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.358894 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.359164 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.359431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.359676 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.463741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.465253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.465295 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.465339 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.465373 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.568318 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.568396 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.568414 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.568493 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.568514 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.671552 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.671596 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.671605 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.671620 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.671630 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.775185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.775234 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.775244 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.775259 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.775270 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.879558 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.879633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.879659 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.879691 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.879714 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.982470 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.982546 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.982570 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.982598 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:39 crc kubenswrapper[4933]: I1202 15:53:39.982621 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:39Z","lastTransitionTime":"2025-12-02T15:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.052381 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.052427 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.052651 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.052734 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.053076 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.053241 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.085231 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.085312 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.085332 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.085363 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.085383 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.181482 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.181702 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.181756 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.181795 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.181866 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.181789264 +0000 UTC m=+147.433016007 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.181952 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.181959 4933 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182122 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182184 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182199 4933 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182134 4933 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182018 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182319 4933 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182344 4933 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182152 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.182137204 +0000 UTC m=+147.433363947 (durationBeforeRetry 1m4s). 
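The UnmountVolume.TearDown failure above reports that kubevirt.io.hostpath-provisioner is not in the kubelet's list of registered CSI drivers. Drivers register with the kubelet through sockets in its plugin-registration directory; the sketch below assumes the default location (/var/lib/kubelet/plugins_registry, an assumption for this node) and simply lists what is registered:

	// list_csi_registrations.go - minimal sketch, assuming the default kubelet
	// plugin-registration directory. If no entry for
	// kubevirt.io.hostpath-provisioner exists, the TearDownAt error above is
	// expected until the driver pod re-registers.
	package main

	import (
		"fmt"
		"log"
		"os"
		"strings"
	)

	func main() {
		const regDir = "/var/lib/kubelet/plugins_registry" // kubelet default; assumed here
		entries, err := os.ReadDir(regDir)
		if err != nil {
			log.Fatalf("read %s: %v", regDir, err)
		}
		seen := false
		for _, e := range entries {
			fmt.Println("registered:", e.Name())
			if strings.Contains(e.Name(), "kubevirt.io.hostpath-provisioner") {
				seen = true
			}
		}
		if !seen {
			fmt.Println("kubevirt.io.hostpath-provisioner not registered; matches the TearDownAt error")
		}
	}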
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182426 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.18239419 +0000 UTC m=+147.433620933 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182453 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.182440762 +0000 UTC m=+147.433667495 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 15:53:40 crc kubenswrapper[4933]: E1202 15:53:40.182474 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.182462942 +0000 UTC m=+147.433689685 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.189708 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.189770 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.189795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.190041 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.190064 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.292832 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.292877 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.292887 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.292902 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.292912 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.395718 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.395745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.395754 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.395771 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.395790 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.498698 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.498762 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.498776 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.498797 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.498814 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.602243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.602713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.602914 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.603084 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.603249 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.706078 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.706472 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.706586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.706698 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.706790 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.809602 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.809659 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.809673 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.809693 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.809708 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.912946 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.913342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.913431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.913544 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:40 crc kubenswrapper[4933]: I1202 15:53:40.913633 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:40Z","lastTransitionTime":"2025-12-02T15:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.017004 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.017328 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.017397 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.017465 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.017543 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.053192 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.053390 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.121211 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.121261 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.121270 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.121288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.121297 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.224445 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.224525 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.224538 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.224565 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.224577 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.327342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.327394 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.327403 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.327422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.327434 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.430622 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.430681 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.430696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.430717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.430731 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.534380 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.534448 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.534470 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.534498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.534522 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.637637 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.637724 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.637736 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.637757 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.637773 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.741293 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.741363 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.741383 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.741409 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.741429 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.758399 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.758496 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.758517 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.758587 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.758604 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.779739 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:41Z is after 
2025-08-24T17:21:41Z" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.784981 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.785080 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.785105 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.785137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.785157 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.805744 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:41Z is after 
2025-08-24T17:21:41Z" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.811670 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.812161 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.812254 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.812364 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.812461 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.826687 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:41Z is after 
2025-08-24T17:21:41Z" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.831094 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.831137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.831150 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.831170 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.831184 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.851093 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:41Z is after 
2025-08-24T17:21:41Z" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.855875 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.856064 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.856137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.856203 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.856273 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.869751 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:41Z is after 
2025-08-24T17:21:41Z" Dec 02 15:53:41 crc kubenswrapper[4933]: E1202 15:53:41.869890 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.871499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.871527 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.871539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.871554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.871567 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.974707 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.975173 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.975307 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.975448 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:41 crc kubenswrapper[4933]: I1202 15:53:41.975585 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:41Z","lastTransitionTime":"2025-12-02T15:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.052729 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:42 crc kubenswrapper[4933]: E1202 15:53:42.053229 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.052852 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:42 crc kubenswrapper[4933]: E1202 15:53:42.053417 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.052729 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:42 crc kubenswrapper[4933]: E1202 15:53:42.053589 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.053674 4933 scope.go:117] "RemoveContainer" containerID="e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.079097 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.079163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.079185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.079216 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.079240 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.182554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.183046 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.183067 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.183095 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.183112 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.285913 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.285982 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.286001 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.286029 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.286048 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.390148 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.390195 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.390206 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.390290 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.390302 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.493512 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.493575 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.493592 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.493616 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.493633 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.597466 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.597561 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.597586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.597618 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.597644 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.701340 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.701390 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.701407 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.701431 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.701449 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.804582 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.804666 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.804687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.804706 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.804717 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.846532 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/2.log" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.850879 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.852326 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.871571 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.884288 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.898051 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.908400 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.908453 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.908464 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.908483 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.908495 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:42Z","lastTransitionTime":"2025-12-02T15:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.911884 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.923622 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.938105 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.959901 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeae
f3618aa1d679249fd3a96058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.974492 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 
15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.985332 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:42 crc kubenswrapper[4933]: I1202 15:53:42.998589 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:42Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.009937 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.011287 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.011340 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.011361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.011456 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.011521 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.020239 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.031066 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.042495 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.053295 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:43 crc kubenswrapper[4933]: E1202 15:53:43.053746 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
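
Every status patch in this stretch fails the same way: the kubelet's PATCH is intercepted by the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, and the webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current clock time of 2025-12-02. Below is a minimal sketch in Go of the expiry check behind the x509 error, assuming the certificate is readable from the /etc/webhook-cert/ mount that the network-node-identity webhook container declares further down in this log; the tls.crt file name is an assumption, not something the log confirms.

    package main

    // Sketch: report whether a PEM-encoded serving certificate has expired,
    // mirroring the "certificate has expired or is not yet valid" x509 check.
    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Path assumption: the webhook container mounts "webhook-cert" at
        // /etc/webhook-cert/; "tls.crt" is a conventional name only.
        data, err := os.ReadFile("/etc/webhook-cert/tls.crt")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            fmt.Fprintln(os.Stderr, "no PEM block found")
            os.Exit(1)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println("NotBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("NotAfter: ", cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            fmt.Println("expired: current time is after NotAfter")
        }
    }

Until that certificate is rotated, every pod status patch from this node keeps failing identically, which is why the same error repeats for each pod below.
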
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.057304 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.069436 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.070534 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.082743 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.094925 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:43Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.114108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.114162 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.114176 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.114194 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.114210 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.222918 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.222960 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.222970 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.222985 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.222995 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.325743 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.325811 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.325842 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.325861 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.325872 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.429394 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.429444 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.429452 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.429473 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.429484 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.532163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.532218 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.532231 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.532252 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.532264 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.635180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.635244 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.635256 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.635276 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.635292 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.737997 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.738064 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.738079 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.738100 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.738120 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.841176 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.841247 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.841259 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.841277 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.841292 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.944154 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.944217 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.944233 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.944259 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:43 crc kubenswrapper[4933]: I1202 15:53:43.944276 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:43Z","lastTransitionTime":"2025-12-02T15:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.047505 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.047545 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.047556 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.047571 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.047579 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.053003 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.053087 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:44 crc kubenswrapper[4933]: E1202 15:53:44.053141 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.053095 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:44 crc kubenswrapper[4933]: E1202 15:53:44.053216 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:44 crc kubenswrapper[4933]: E1202 15:53:44.053316 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.150172 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.150215 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.150225 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.150241 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.150255 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.253127 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.253161 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.253171 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.253187 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.253197 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.355926 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.356001 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.356025 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.356055 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.356076 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.458747 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.458811 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.458865 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.458892 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.458910 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.562402 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.562473 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.562493 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.562518 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.562533 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.665424 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.665498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.665522 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.665551 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.665573 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.769325 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.769418 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.769435 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.769457 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.769474 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.859487 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/3.log" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.860710 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/2.log" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.865913 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" exitCode=1 Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.865988 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.866056 4933 scope.go:117] "RemoveContainer" containerID="e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.867228 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 15:53:44 crc kubenswrapper[4933]: E1202 15:53:44.867494 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.872328 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.872384 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.872405 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.872437 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.872459 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
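
At this point ovnkube-controller has exited again with code 1 and the kubelet backs off rather than restarting it immediately: "back-off 40s restarting failed container". The kubelet's crash-loop backoff is commonly described as doubling per consecutive failure from a 10s base up to a 5-minute cap, which would make 40s the third failure in a row; those defaults are background knowledge about the kubelet, not values printed in this log. A short sketch of that progression:

    package main

    // Sketch of a kubelet-style crash-loop backoff: the restart delay doubles
    // per consecutive failure, starting at 10s and capped at 5 minutes.
    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second
        const maxDelay = 5 * time.Minute
        for failure := 1; failure <= 7; failure++ {
            fmt.Printf("failure %d -> back-off %s\n", failure, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
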
Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.896483 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.914905 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.929213 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.946988 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.960410 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.975322 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.975356 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.975370 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.975386 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.975400 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:44Z","lastTransitionTime":"2025-12-02T15:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.984565 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c736b69-0c6f-42a9-bea8-cae1d3274483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://292a536fa8488df1cde96062a3d40cbe5ecf556b30ae0ba609d9deb6b8dde7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d4f02704d1f6ee3644bceab7f4cd44ef43e948d1777fa2e225642a5da98901f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e5b219049292adb37755387b666dabb044ec252e589f4f554a96bb2e858612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86a4b6d42de44d954b5bcf8b42ac10cffaac09c9f7e362a2d26b4a0fcba9a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1258d15649d53d011ca2d8bb0e3e6b17ddaabe8ee2de9d184d85141ef2229ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:44 crc kubenswrapper[4933]: I1202 15:53:44.999817 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:44Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.025847 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeae
f3618aa1d679249fd3a96058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:44Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707176 6999 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707700 6999 obj_retry.go:551] Creating *factory.egressNode crc took: 2.297771ms\\\\nI1202 15:53:43.707729 6999 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 15:53:43.707766 6999 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 15:53:43.708052 6999 factory.go:1336] Added *v1.EgressFirewall event handler 
9\\\\nI1202 15:53:43.708145 6999 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 15:53:43.708185 6999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:53:43.708211 6999 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 15:53:43.708294 6999 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.043376 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 
15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.053089 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:45 crc kubenswrapper[4933]: E1202 15:53:45.053274 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.064554 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001
edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.077809 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.077868 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.077884 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.077904 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.077919 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.081072 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.096269 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.109395 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.123302 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.143010 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.164259 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
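A pattern worth noting across these patches: pods whose podIP equals the host IP (machine-config-daemon, node-resolver, node-ca) report phase Running, while network-metrics-daemon, networking-console-plugin, and network-check-target sit in ContainerCreating with no podIP at all, because they need a CNI-provided sandbox that cannot be created yet. This is not the kubelet's actual code, but a sketch of the effective gating rule visible here:

```go
// Sketch of the gating visible in this excerpt: sandbox creation for a
// non-host-network pod is skipped while the runtime network is not ready,
// while host-network pods keep running.
package main

import "fmt"

type pod struct {
	name        string
	hostNetwork bool
}

func syncPod(p pod, networkReady bool) error {
	if !networkReady && !p.hostNetwork {
		return fmt.Errorf("network is not ready: skipping sandbox creation for %s", p.name)
	}
	// ... create sandbox, start containers ...
	return nil
}

func main() {
	pods := []pod{
		{"network-metrics-daemon-qbps2", false},
		{"node-resolver-5d9dn", true}, // host-network pods are unaffected
	}
	for _, p := range pods {
		if err := syncPod(p, false); err != nil {
			fmt.Println("E:", err)
		} else {
			fmt.Println("synced:", p.name)
		}
	}
}
```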
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.180965 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.181027 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.181044 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.181071 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.181090 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.183208 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.199319 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.221031 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:45Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.283731 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.283796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.283853 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.283895 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.283920 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container 
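The multus container's previous termination (exit code 1, restartCount now 1) is explained by its own log tail above: it copied the CNI binaries at 15:52:45, polled for the OVN readiness-indicator file for about 45 seconds until the PollImmediate deadline at 15:53:30, and exited; it has been running again since 15:53:30. A stand-alone sketch of that wait-for-file pattern; the path, interval, and timeout are illustrative:

```go
// Sketch of the readiness-indicator wait multus performs: poll for a file
// until it appears or a deadline passes. Values are illustrative.
package main

import (
	"fmt"
	"log"
	"os"
	"time"
)

func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		time.Sleep(interval)
	}
}

func main() {
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		time.Second, 45*time.Second)
	if err != nil {
		log.Fatal(err) // multus exits non-zero here, matching restartCount=1
	}
	fmt.Println("default network ready")
}
```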
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.387048 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.387505 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.387774 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.387987 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.388125 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.490795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.490910 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.490928 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.490951 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.490968 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.594116 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.594188 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.594206 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.594233 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.594249 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.697605 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.698092 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.698310 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.698586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.698818 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.802531 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.802577 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.802592 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.802614 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.802629 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.872328 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/3.log" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.905968 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.906031 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.906054 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.906079 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:45 crc kubenswrapper[4933]: I1202 15:53:45.906096 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:45Z","lastTransitionTime":"2025-12-02T15:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.014042 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.014085 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.014093 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.014111 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.014124 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.053322 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.053418 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:46 crc kubenswrapper[4933]: E1202 15:53:46.053472 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.053536 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:46 crc kubenswrapper[4933]: E1202 15:53:46.053577 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:46 crc kubenswrapper[4933]: E1202 15:53:46.053660 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.118016 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.118084 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.118103 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.118125 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.118172 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.221563 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.222012 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.222032 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.222057 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.222076 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.326633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.326700 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.326719 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.326742 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.326759 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.429926 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.429994 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.430016 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.430038 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.430052 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.533405 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.533459 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.533475 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.533499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.533516 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.635789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.635873 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.635883 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.635926 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.635942 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.738972 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.739022 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.739034 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.739051 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.739062 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.842098 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.842149 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.842161 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.842175 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.842184 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.945401 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.945476 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.945494 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.945517 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:46 crc kubenswrapper[4933]: I1202 15:53:46.945537 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:46Z","lastTransitionTime":"2025-12-02T15:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.048403 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.048454 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.048464 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.048482 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.048494 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.052782 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:47 crc kubenswrapper[4933]: E1202 15:53:47.053038 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.069561 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.090126 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.108967 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.125963 4933 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.143167 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.151819 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.151907 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.151924 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.151948 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.151966 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.161158 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.182743 4933 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.205563 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.220912 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.242726 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.255332 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.255397 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.255422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.255454 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.255479 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.258195 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.273936 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.287325 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.301530 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.314400 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.334930 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c736b69-0c6f-42a9-bea8-cae1d3274483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://292a536fa8488df1cde96062a3d40cbe5ecf556b30ae0ba609d9deb6b8dde7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d4f02704d1f6ee3644bceab7f4cd44ef43e948d1777fa2e225642a5da98901f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e5b219049292adb37755387b666dabb044ec252e589f4f554a96bb2e858612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86a4b6d42de44d954b5bcf8b42ac10cffaac09
c9f7e362a2d26b4a0fcba9a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1258d15649d53d011ca2d8bb0e3e6b17ddaabe8ee2de9d184d85141ef2229ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.350735 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.358515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.358649 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.358668 4933 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.358690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.358707 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.374904 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeae
f3618aa1d679249fd3a96058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:44Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707176 6999 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707700 6999 obj_retry.go:551] Creating *factory.egressNode crc took: 2.297771ms\\\\nI1202 15:53:43.707729 6999 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 15:53:43.707766 6999 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 15:53:43.708052 6999 factory.go:1336] Added *v1.EgressFirewall event handler 
9\\\\nI1202 15:53:43.708145 6999 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 15:53:43.708185 6999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:53:43.708211 6999 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 15:53:43.708294 6999 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.391719 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:47Z is after 2025-08-24T17:21:41Z" Dec 02 
15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.461748 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.461813 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.461838 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.461852 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.461860 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.565451 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.565525 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.565546 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.565577 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.565601 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.668884 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.668980 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.669000 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.669025 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.669044 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.772531 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.772638 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.772799 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.772882 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.772944 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.875716 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.875787 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.875799 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.875836 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.875854 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.978870 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.978938 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.978960 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.978989 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:47 crc kubenswrapper[4933]: I1202 15:53:47.979011 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:47Z","lastTransitionTime":"2025-12-02T15:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.052913 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.052961 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.053022 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:48 crc kubenswrapper[4933]: E1202 15:53:48.053074 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:48 crc kubenswrapper[4933]: E1202 15:53:48.053216 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:48 crc kubenswrapper[4933]: E1202 15:53:48.053273 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.082303 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.082350 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.082359 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.082390 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.082402 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.185863 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.185927 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.185938 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.185955 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.185968 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.289359 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.289478 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.289504 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.289534 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.289556 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.392855 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.392922 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.392940 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.392967 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.392984 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.496781 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.496897 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.496927 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.496959 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.496983 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.600174 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.600320 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.600344 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.600374 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.600395 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.703743 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.703856 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.703883 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.703909 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.703930 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.807589 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.807680 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.807693 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.807713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.807726 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.911411 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.911480 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.911498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.911523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:48 crc kubenswrapper[4933]: I1202 15:53:48.911539 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:48Z","lastTransitionTime":"2025-12-02T15:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.014177 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.014222 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.014234 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.014253 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.014264 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:49Z","lastTransitionTime":"2025-12-02T15:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.052927 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:49 crc kubenswrapper[4933]: E1202 15:53:49.053192 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.117514 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.117574 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.117593 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.117618 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.117639 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:49Z","lastTransitionTime":"2025-12-02T15:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.220879 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.220939 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.220956 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.220979 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.220996 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:49Z","lastTransitionTime":"2025-12-02T15:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.941407 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.941478 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.941495 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.941521 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:49 crc kubenswrapper[4933]: I1202 15:53:49.941539 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:49Z","lastTransitionTime":"2025-12-02T15:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.043714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.043752 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.043763 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.043778 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.043789 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:50Z","lastTransitionTime":"2025-12-02T15:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.052587 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.052886 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.052993 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:50 crc kubenswrapper[4933]: E1202 15:53:50.053244 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:50 crc kubenswrapper[4933]: E1202 15:53:50.053500 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:50 crc kubenswrapper[4933]: E1202 15:53:50.053584 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.146572 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.146617 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.146630 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.146648 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.146735 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:50Z","lastTransitionTime":"2025-12-02T15:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.249273 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.249314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.249322 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.249336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.249346 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:50Z","lastTransitionTime":"2025-12-02T15:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.971949 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.972023 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.972040 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.972066 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:50 crc kubenswrapper[4933]: I1202 15:53:50.972085 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:50Z","lastTransitionTime":"2025-12-02T15:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.053416 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:51 crc kubenswrapper[4933]: E1202 15:53:51.053681 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.074787 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.074893 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.074914 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.074942 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.074962 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:51Z","lastTransitionTime":"2025-12-02T15:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.800689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.800780 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.800804 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.800883 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.800909 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:51Z","lastTransitionTime":"2025-12-02T15:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
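The condition the kubelet keeps re-recording above has a single stated cause: no CNI configuration has been written to /etc/kubernetes/cni/net.d/ yet, so the network plugin is reported NotReady and every pod sync is skipped. As a minimal illustrative check (assuming direct filesystem access on the node; this script and its names are not part of OpenShift or CRC tooling, and the path comes straight from the log message), one could list what the kubelet is waiting for:

```python
# Minimal sketch, assuming shell access on the node itself. The kubelet
# condition above repeats because the network plugin has not yet written
# a CNI config; this just shows what it is waiting for.
import pathlib

cni_dir = pathlib.Path("/etc/kubernetes/cni/net.d")  # path from the log message

if not cni_dir.is_dir():
    print(f"{cni_dir} does not exist yet")
else:
    confs = sorted(cni_dir.glob("*.conf")) + sorted(cni_dir.glob("*.conflist"))
    if not confs:
        print(f"{cni_dir} is empty -- matches NetworkPluginNotReady above")
    for p in confs:
        print("found CNI config:", p.name)
```

Once the network operator writes a *.conf or *.conflist file there, the NetworkReady condition should flip and the NodeNotReady records above should stop repeating.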
Dec 02 15:53:51 crc kubenswrapper[4933]: E1202 15:53:51.948433 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.956282 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.956336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.956348 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.956368 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.956384 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:51Z","lastTransitionTime":"2025-12-02T15:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:51 crc kubenswrapper[4933]: E1202 15:53:51.978243 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:51Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.983667 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.983723 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
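The "Error updating node status, will retry" records above all fail for the same reason: the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a TLS certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-02, so every status patch is rejected before it reaches the API server. A hedged sketch of how one might confirm the validity window from the node follows; the host and port are taken from the failed Post URL in the log, everything else is illustrative and not part of the cluster's tooling, and the third-party cryptography package (version 42 or newer, for the *_utc accessors) is assumed to be installed:

```python
# Diagnostic sketch: fetch the certificate served by the webhook endpoint
# named in the error above and compare its validity window with "now".
import datetime
import ssl

from cryptography import x509  # third-party; assumed available

HOST, PORT = "127.0.0.1", 9743  # from https://127.0.0.1:9743/node?timeout=10s

# get_server_certificate performs a handshake without verification,
# so it still works when the served certificate has already expired.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.now(datetime.timezone.utc)
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
if now > cert.not_valid_after_utc:
    print("expired -- consistent with the x509 error in the log above")
```

Against this log, notAfter should come back as 2025-08-24T17:21:41Z; until that certificate is rotated, the kubelet's status patches will keep being retried and failing as shown below.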
event="NodeHasNoDiskPressure" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.983735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.983755 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:51 crc kubenswrapper[4933]: I1202 15:53:51.983767 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:51Z","lastTransitionTime":"2025-12-02T15:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.004392 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.013533 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.013595 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.013610 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.013630 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.013649 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.032211 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.036906 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.036954 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.036966 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.036980 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.036989 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.048840 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:52Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.048973 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.050696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.050725 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.050735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.050748 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.050758 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.053108 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.053115 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.053116 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.053220 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.053266 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:52 crc kubenswrapper[4933]: E1202 15:53:52.053325 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.154359 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.154437 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.154446 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.154463 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.154474 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.257453 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.257553 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.257576 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.257611 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.257633 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.360785 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.360914 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.360938 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.360968 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.360995 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.464044 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.464112 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.464128 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.464156 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.464174 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.566907 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.566977 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.566998 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.567030 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.567056 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.670932 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.671026 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.671059 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.671094 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.671117 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.774600 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.774670 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.774692 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.774719 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.774736 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.877572 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.877955 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.878041 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.878130 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.878204 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.981564 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.981620 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.981633 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.981651 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:52 crc kubenswrapper[4933]: I1202 15:53:52.981662 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:52Z","lastTransitionTime":"2025-12-02T15:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.052758 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:53 crc kubenswrapper[4933]: E1202 15:53:53.053124 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.083733 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.083789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.083802 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.083843 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.083856 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.186880 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.186997 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.187010 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.187030 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.187048 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.296205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.296318 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.296337 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.296365 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.296390 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.399004 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.399082 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.399100 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.399126 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.399143 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.502868 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.503227 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.503334 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.503440 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.503560 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.606660 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.606741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.606769 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.606811 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.606893 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.709338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.709570 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.709585 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.709606 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.709621 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.813157 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.813664 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.813844 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.814033 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.814209 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.917555 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.917607 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.917628 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.917652 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:53 crc kubenswrapper[4933]: I1202 15:53:53.917670 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:53Z","lastTransitionTime":"2025-12-02T15:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.021060 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.021095 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.021107 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.021121 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.021133 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.052560 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:54 crc kubenswrapper[4933]: E1202 15:53:54.052747 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.053045 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.053086 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:54 crc kubenswrapper[4933]: E1202 15:53:54.053183 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:54 crc kubenswrapper[4933]: E1202 15:53:54.053745 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.124312 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.124409 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.124434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.124466 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.124495 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.228247 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.229241 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.229447 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.229611 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.229740 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.333346 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.333394 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.333406 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.333424 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.333436 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.436213 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.436287 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.436311 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.436342 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.436364 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.543406 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.543494 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.543559 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.543585 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.543603 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.646809 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.646923 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.646944 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.646968 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.646988 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.749717 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.750092 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.750168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.750266 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.750334 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.853373 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.853434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.853446 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.853466 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.853478 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.956526 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.956581 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.956599 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.956624 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:54 crc kubenswrapper[4933]: I1202 15:53:54.956642 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:54Z","lastTransitionTime":"2025-12-02T15:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.053189 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:55 crc kubenswrapper[4933]: E1202 15:53:55.053520 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.060545 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.060636 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.060665 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.060697 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.060721 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.164187 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.164369 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.164400 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.164434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.164459 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.267309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.267379 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.267400 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.267425 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.267443 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.370632 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.370716 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.370735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.370760 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.370778 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.474206 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.474255 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.474269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.474310 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.474329 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.576809 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.576925 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.576942 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.576963 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.576984 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.679609 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.679655 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.679666 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.679682 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.679697 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.782604 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.782661 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.782672 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.782689 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.782701 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.885486 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.885550 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.885569 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.885594 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.885612 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.988303 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.988361 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.988374 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.988392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:55 crc kubenswrapper[4933]: I1202 15:53:55.988403 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:55Z","lastTransitionTime":"2025-12-02T15:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.052775 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.052934 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:56 crc kubenswrapper[4933]: E1202 15:53:56.053018 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.052934 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:56 crc kubenswrapper[4933]: E1202 15:53:56.053127 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:56 crc kubenswrapper[4933]: E1202 15:53:56.053176 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.091817 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.091903 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.091920 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.091947 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.091964 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.195026 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.195116 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.195140 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.195168 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.195191 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.298337 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.298388 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.298399 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.298421 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.298446 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.401649 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.401712 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.401729 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.401751 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.401765 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.504983 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.505038 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.505049 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.505065 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.505077 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.608961 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.609021 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.609037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.609057 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.609072 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.711500 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.711561 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.711580 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.711602 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.711618 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.814754 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.814806 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.814816 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.814870 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.814882 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.917178 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.917288 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.917314 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.917338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:56 crc kubenswrapper[4933]: I1202 15:53:56.917357 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:56Z","lastTransitionTime":"2025-12-02T15:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.020442 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.020528 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.020539 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.020559 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.020576 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.052898 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:57 crc kubenswrapper[4933]: E1202 15:53:57.053114 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.055056 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 15:53:57 crc kubenswrapper[4933]: E1202 15:53:57.055445 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.077490 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355
d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.099100 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.119103 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.123751 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.123918 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.124099 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.124238 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.124359 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.139020 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.151988 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.188178 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c736b69-0c6f-42a9-bea8-cae1d3274483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://292a536fa8488df1cde96062a3d40cbe5ecf556b30ae0ba609d9deb6b8dde7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d4f02704d1f6ee3644bceab7f4cd44ef43e948d1777fa2e225642a5da98901f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e5b219049292adb37755387b666dabb044ec252e589f4f554a96bb2e858612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86a4b6d42de44d954b5bcf8b42ac10cffaac09
c9f7e362a2d26b4a0fcba9a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1258d15649d53d011ca2d8bb0e3e6b17ddaabe8ee2de9d184d85141ef2229ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.208495 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.227767 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.227898 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.227918 4933 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.227945 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.227963 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.238068 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeae
f3618aa1d679249fd3a96058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68bfc1eb2e35cbb02950eed88f4e4cb09b5ee597cbf3dbefcbb7dca7a9f90d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:08Z\\\",\\\"message\\\":\\\"g for pod on switch crc\\\\nI1202 15:53:07.975214 6599 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975521 6599 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 15:53:07.975535 6599 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1202 15:53:07.975530 6599 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1202 15:53:07.975580 6599 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1202 15:53:07.975354 6599 services_controller.go:444] Built service openshift-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1202 15:53:07.975596 6599 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1202 15:53:07.975612 6599 services_controller.go:445] Built service openshift-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nF1202 15:53:07.975627 6599 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:44Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707176 6999 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707700 6999 obj_retry.go:551] Creating *factory.egressNode crc took: 2.297771ms\\\\nI1202 15:53:43.707729 6999 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 15:53:43.707766 6999 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 15:53:43.708052 6999 factory.go:1336] Added *v1.EgressFirewall event handler 
9\\\\nI1202 15:53:43.708145 6999 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 15:53:43.708185 6999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:53:43.708211 6999 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 15:53:43.708294 6999 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.252755 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 
15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.270764 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.287176 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.305317 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.320480 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.330964 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.331350 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.331445 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.331538 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.331608 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.335427 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.349455 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.365818 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.380807 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.395558 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.411195 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.434912 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.435280 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.435447 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.435614 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.435772 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.443282 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c736b69-0c6f-42a9-bea8-cae1d3274483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://292a536fa8488df1cde96062a3d40cbe5ecf556b30ae0ba609d9deb6b8dde7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d4f02704d1f6ee3644bceab7f4cd44ef43e948d1777fa2e225642a5da98901f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e5b219049292adb37755387b666dabb044ec252e589f4f554a96bb2e858612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86a4b6d42de44d954b5bcf8b42ac10cffaac09c9f7e362a2d26b4a0fcba9a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1258d15649d53d011ca2d8bb0e3e6b17ddaabe8ee2de9d184d85141ef2229ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.466374 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.485156 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeae
f3618aa1d679249fd3a96058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:44Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707176 6999 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707700 6999 obj_retry.go:551] Creating *factory.egressNode crc took: 2.297771ms\\\\nI1202 15:53:43.707729 6999 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 15:53:43.707766 6999 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 15:53:43.708052 6999 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 15:53:43.708145 6999 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 15:53:43.708185 6999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:53:43.708211 6999 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 15:53:43.708294 6999 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.497773 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.511272 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.528150 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.538865 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.539131 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.539315 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.539460 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.539603 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.544131 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.559234 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.573769 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.587737 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.599747 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.611271 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.624162 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.638960 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.643480 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.643541 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.643561 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.643586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.643610 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.656437 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.673507 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.694669 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.712892 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.735567 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:53:57Z is after 2025-08-24T17:21:41Z" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.746725 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.746769 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.746783 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.746801 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.746817 4933 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.848985 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.849018 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.849027 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.849040 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.849048 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.951152 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.951193 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.951202 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.951218 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:57 crc kubenswrapper[4933]: I1202 15:53:57.951228 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:57Z","lastTransitionTime":"2025-12-02T15:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.052600 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.052680 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.052746 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:53:58 crc kubenswrapper[4933]: E1202 15:53:58.052809 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:53:58 crc kubenswrapper[4933]: E1202 15:53:58.052988 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:53:58 crc kubenswrapper[4933]: E1202 15:53:58.053274 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.054966 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.055019 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.055037 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.055060 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.055078 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.159039 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.159121 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.159299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.159339 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.159363 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.263299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.263350 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.263367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.263391 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.263411 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.366390 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.366454 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.366471 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.366495 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.366514 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.468636 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.468690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.468699 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.468713 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.468724 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.571232 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.571262 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.571284 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.571297 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.571306 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.674434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.674489 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.674499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.674512 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.674520 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.778503 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.778559 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.778568 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.778585 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.778597 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.882285 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.882354 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.882378 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.882408 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.882450 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.986602 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.986654 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.986667 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.986685 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:58 crc kubenswrapper[4933]: I1202 15:53:58.986696 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:58Z","lastTransitionTime":"2025-12-02T15:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.053183 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:53:59 crc kubenswrapper[4933]: E1202 15:53:59.053448 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.090173 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.090229 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.090246 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.090268 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.090283 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.193639 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.193704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.193718 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.193739 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.193752 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.297453 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.297516 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.297533 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.297555 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.297568 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.401355 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.401447 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.401506 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.401553 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.401587 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.505556 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.505706 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.505741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.505781 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.505804 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.608773 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.608858 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.608875 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.608896 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.608911 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.712456 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.712489 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.712499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.712516 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.712527 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.815609 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.815660 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.815673 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.815690 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.815703 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.918346 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.918413 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.918434 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.918459 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:53:59 crc kubenswrapper[4933]: I1202 15:53:59.918477 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:53:59Z","lastTransitionTime":"2025-12-02T15:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.022177 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.022243 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.022268 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.022300 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.022327 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.052876 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.052938 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.052983 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:00 crc kubenswrapper[4933]: E1202 15:54:00.053078 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:00 crc kubenswrapper[4933]: E1202 15:54:00.053250 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:00 crc kubenswrapper[4933]: E1202 15:54:00.053660 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.125263 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.125323 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.125341 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.125366 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.125383 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.228177 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.228223 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.228232 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.228247 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.228258 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.331054 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.331122 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.331140 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.331164 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.331183 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.434497 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.434544 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.434554 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.434570 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.434581 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.537345 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.537423 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.537449 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.537477 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.537495 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.640710 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.640773 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.640797 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.640860 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.640884 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.744084 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.744146 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.744163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.744186 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.744202 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.847301 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.847357 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.847373 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.847392 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.847406 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.950745 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.950795 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.950804 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.950818 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:00 crc kubenswrapper[4933]: I1202 15:54:00.950858 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:00Z","lastTransitionTime":"2025-12-02T15:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.036388 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:01 crc kubenswrapper[4933]: E1202 15:54:01.036674 4933 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 15:54:01 crc kubenswrapper[4933]: E1202 15:54:01.036796 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs podName:c95a4730-1427-4097-9ca3-4bd251e7acf0 nodeName:}" failed. No retries permitted until 2025-12-02 15:55:05.036769292 +0000 UTC m=+168.287996035 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs") pod "network-metrics-daemon-qbps2" (UID: "c95a4730-1427-4097-9ca3-4bd251e7acf0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.052901 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.053252 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.053311 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.053330 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: E1202 15:54:01.053336 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.053353 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.053394 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.157494 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.157566 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.157588 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.157617 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.157639 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.261310 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.261378 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.261396 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.261422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.261440 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.365205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.365259 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.365269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.365287 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.365297 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.468064 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.468137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.468155 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.468187 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.468206 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.571356 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.571432 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.571445 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.571462 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.571477 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.674291 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.674422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.674474 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.674506 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.674525 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.778788 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.778896 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.778917 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.778943 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.778963 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.882696 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.882773 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.882789 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.882820 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.882867 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.986491 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.986566 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.986589 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.986616 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:01 crc kubenswrapper[4933]: I1202 15:54:01.986638 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:01Z","lastTransitionTime":"2025-12-02T15:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.053356 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.053566 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.053593 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.053745 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.053803 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.053894 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.090230 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.090286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.090299 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.090319 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.090338 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.101796 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.101874 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.101884 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.101916 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.101929 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.116112 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:02Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.125301 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.125363 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.125386 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.125414 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.125434 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.140080 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:02Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.145562 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.145663 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.145697 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.145730 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.145755 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.162883 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:02Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.168379 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.168450 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.168468 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.168499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.168521 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.183610 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:02Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.189333 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.189410 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.189438 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.189471 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.189496 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.207249 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.207249 4933 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T15:54:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[image list identical to the previous status patch attempt elided],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b45811fa-f657-451d-9a34-cdd268fcc941\\\",\\\"systemUUID\\\":\\\"84b7b789-bc9b-466b-8619-2bf2e1fdb8d0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:02Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:02 crc kubenswrapper[4933]: E1202 15:54:02.207382 4933 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
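Note that both status-patch attempts above fail for the same reason: the TLS handshake with the node.network-node-identity.openshift.io webhook is rejected before the payload is ever examined, because the webhook's serving certificate expired on 2025-08-24 while the node clock reads 2025-12-02. The kubelet then gives up after a small fixed retry budget (the "exceeds retry count" entry; nodeStatusUpdateRetry is 5 in the kubelet sources I know). A minimal sketch of the validity-window check behind the x509 error, using the two timestamps from the log line:

    from datetime import datetime, timezone

    def parse_utc(ts: str) -> datetime:
        # Timestamps in the error are RFC 3339 with a "Z" (UTC) suffix.
        return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

    now       = parse_utc("2025-12-02T15:54:02Z")  # "current time" reported by the kubelet
    not_after = parse_utc("2025-08-24T17:21:41Z")  # certificate notAfter from the error

    # x509 validation requires notBefore <= now <= notAfter; here now is more than
    # three months past notAfter, so every Post to https://127.0.0.1:9743/node fails.
    print(now > not_after)  # True -> "certificate has expired or is not yet valid"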
event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.209062 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.209074 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.209089 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.209101 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.312109 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.312167 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.312185 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.312206 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.312222 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.415563 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.415642 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.415663 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.415687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.415708 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.518615 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.518679 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.518697 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.518725 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.518745 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.622623 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.622735 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.622756 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.622813 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.622900 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.725477 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.725552 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.725569 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.725598 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.725617 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.828119 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.828199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.828232 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.828250 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.828264 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.931014 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.931088 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.931108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.931134 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:02 crc kubenswrapper[4933]: I1202 15:54:02.931157 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:02Z","lastTransitionTime":"2025-12-02T15:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.033926 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.033973 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.033983 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.034002 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.034015 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.053197 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:03 crc kubenswrapper[4933]: E1202 15:54:03.053419 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.136029 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.136096 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.136108 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.136127 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.136140 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.238648 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.238714 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.238725 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.238746 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.238759 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.341863 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.341956 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.341994 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.342030 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.342052 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.445307 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.445489 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.445523 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.445551 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.445571 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.548898 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.548959 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.548977 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.548995 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.549007 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.651336 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.651382 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.651402 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.651428 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.651444 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.755051 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.755118 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.755137 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.755160 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.755176 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.857646 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.857687 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.857704 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.857723 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.857734 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.960222 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.960278 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.960291 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.960308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:03 crc kubenswrapper[4933]: I1202 15:54:03.960320 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:03Z","lastTransitionTime":"2025-12-02T15:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.052904 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.052984 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:04 crc kubenswrapper[4933]: E1202 15:54:04.053028 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.053062 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:04 crc kubenswrapper[4933]: E1202 15:54:04.053150 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:04 crc kubenswrapper[4933]: E1202 15:54:04.053241 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.063235 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.063297 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.063309 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.063327 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.063341 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.167189 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.167278 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.167295 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.167349 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.167365 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.270417 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.270516 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.270535 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.270595 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.270617 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.373646 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.373693 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.373706 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.373721 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.373734 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.476032 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.476090 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.476106 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.476131 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.476152 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.579209 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.579261 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.579271 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.579286 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.579295 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.681438 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.681473 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.681502 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.681515 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.681524 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.784802 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.784891 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.784907 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.784924 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.784939 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.888139 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.888193 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.888205 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.888227 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.888242 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.991199 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.991242 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.991251 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.991265 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:04 crc kubenswrapper[4933]: I1202 15:54:04.991277 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:04Z","lastTransitionTime":"2025-12-02T15:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.052398 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:05 crc kubenswrapper[4933]: E1202 15:54:05.052569 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.094753 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.094875 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.094912 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.094941 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.094959 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.198134 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.198183 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.198193 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.198211 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.198223 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.301276 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.301352 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.301367 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.301411 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.301426 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.403691 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.403730 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.403739 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.403752 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.403764 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.507076 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.507146 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.507165 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.507194 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.507212 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.610981 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.611062 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.611085 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.611111 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.611128 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.714927 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.714994 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.715013 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.715038 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.715056 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.819064 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.819155 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.819180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.819211 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.819233 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.922136 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.922178 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.922187 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.922204 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:05 crc kubenswrapper[4933]: I1202 15:54:05.922213 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:05Z","lastTransitionTime":"2025-12-02T15:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.025320 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.025401 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.025422 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.025442 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.025458 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.052735 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.052735 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:54:06 crc kubenswrapper[4933]: E1202 15:54:06.052933 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.052768 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:54:06 crc kubenswrapper[4933]: E1202 15:54:06.053116 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:54:06 crc kubenswrapper[4933]: E1202 15:54:06.053300 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.128269 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.128326 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.128334 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.128351 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.128361 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.231426 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.231483 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.231498 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.231518 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.231541 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.335623 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.335682 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.335699 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.335727 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.335743 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.439325 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.439389 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.439401 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.439420 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.439432 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.542467 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.542517 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.542537 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.542562 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.542597 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.646519 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.646586 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.646597 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.646621 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.646635 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.749720 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.749783 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.749805 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.749870 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.749896 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.853572 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.853662 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.853680 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.853707 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.853726 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.957528 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.957578 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.957588 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.957606 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:06 crc kubenswrapper[4933]: I1202 15:54:06.957618 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:06Z","lastTransitionTime":"2025-12-02T15:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.052955 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:07 crc kubenswrapper[4933]: E1202 15:54:07.053484 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.059306 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.059375 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.059387 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.059426 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.059438 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:07Z","lastTransitionTime":"2025-12-02T15:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.066072 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.078625 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4424b26b4c67e9508d92cc6bbc82b291d93c587a8463026a856d87b7b778079e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.093540 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z6kjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b033c545-93a2-4401-842b-22456e44216b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:30Z\\\",\\\"message\\\":\\\"2025-12-02T15:52:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f\\\\n2025-12-02T15:52:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_261455ff-27bf-4826-8730-80c1421f483f to /host/opt/cni/bin/\\\\n2025-12-02T15:52:45Z [verbose] multus-daemon started\\\\n2025-12-02T15:52:45Z [verbose] Readiness Indicator file check\\\\n2025-12-02T15:53:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:53:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdq96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z6kjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.107351 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50efb59904b9275848b8f068dfa8943515c66087209fe13dc75888354ecaff09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.127152 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0555430401fd089ba4f14bef44c9a03bcc4352a3159c34aa592797211ff912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2774460b514418fc05e1d8ac0ca0a8cda1194fab9151804bed266e6bf44c7369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.145924 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.161991 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.162035 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.162050 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.162071 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.162084 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:07Z","lastTransitionTime":"2025-12-02T15:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.170740 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s779q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a30a71-fe04-43a2-8f60-c9b12a0a6e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2367cc195b2375ceed6df398214268e414ae13f5459750b4a1f3bbe4ef59363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a448b3118b6160344bea740978c045ce2351af934805ee95eb6b940963b377bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f11b7eae04e4265f3d00a32c0c1c91419ef6da612c6ecd2e69ef4e90b5f72b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89093d056f612b90064277410dc17c8b1879459a9e9491538ccd8dabeb2703e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e773c0ba8db44a579b0f5b59c7402d773402fbb1b3e3b57bf95e9244df635c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a098d78bfca730eec14ca5f48ae082e24f0623ec3752e1b0182dceb18821e3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fea9eed5df332e515a81666075fc4cb3171b47a7c222b36dac4d5a7533692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgds8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s779q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.187154 4933 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67141d41-dade-4d16-8921-1a3eeaef658e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 15:52:30.552832 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 15:52:30.556124 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1621699441/tls.crt::/tmp/serving-cert-1621699441/tls.key\\\\\\\"\\\\nI1202 15:52:36.166152 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 15:52:36.169452 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 15:52:36.169553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 15:52:36.169614 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 15:52:36.169667 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 15:52:36.177343 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 15:52:36.177409 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 15:52:36.177423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 15:52:36.177428 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 15:52:36.177432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 15:52:36.177437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 15:52:36.177361 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 15:52:36.181525 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.206579 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c736b69-0c6f-42a9-bea8-cae1d3274483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://292a536fa8488df1cde96062a3d40cbe5ecf556b30ae0ba609d9deb6b8dde7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d4f02704d1f6ee3644bceab7f4cd44ef43e948d1777fa2e225642a5da98901f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e5b219049292adb37755387b666dabb044ec252e589f4f554a96bb2e858612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86a4b6d42de44d954b5bcf8b42ac10cffaac09
c9f7e362a2d26b4a0fcba9a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1258d15649d53d011ca2d8bb0e3e6b17ddaabe8ee2de9d184d85141ef2229ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a860a32ad2b16621b3f2208b144968777437df38bf39ce6ab6b534a77ea154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d56c47e0f5d4c1ab0ee1bf1098a81be9d384c8f8035cff1eabc824523b46f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b33eb07acab9a8b66ee5312351af62b7dba3c8212de0dff4cd777235f5f2cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.217461 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06b195f1-3296-4050-9361-eab421cde8d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7399fed7994f0ed3f12a423d3f6796e84d8687f9c16a3050ccbb90e1c80a07d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7cb1abb6f86878fc3daef153191ea3a2ebe06b3f1fc7df959539938c3b6a724\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db14d6e6ebdfa06ff02570eb66fe7ea17a7705fdaa767b6fb91d7ed12eacd59a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.234807 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1972064c-ea30-421c-b009-2bc675a98fcc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T15:53:44Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707176 6999 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 15:53:43.707700 6999 obj_retry.go:551] Creating *factory.egressNode crc took: 2.297771ms\\\\nI1202 15:53:43.707729 6999 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 15:53:43.707766 6999 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 15:53:43.708052 6999 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 15:53:43.708145 6999 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 15:53:43.708185 6999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 15:53:43.708211 6999 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 15:53:43.708294 6999 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T15:53:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pcpjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mklc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.246304 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a161d9d5-a56f-45e9-93e4-50e7220cd31e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38f86b30416fb2f10e8444d5c7c0afe84f16d619e83e7a5e1186eaf4c274a51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cba0775cbbe9a6ce1f0b0fe3559f2b5eb39bd13d4f35686fe2ff92d7d833909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnvpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s488w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.254997 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"448f05b3-d7b9-4eca-a267-b9b4a5a766e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242478d3d008a43cabff06a9c4bf90faf8c8eb3a12f5f43cde5d0156c47d6a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278a1b76a0d8f7d922ab08d7ecd0912f0cde3f3438de43f5c5214e703bd244c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.264230 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.264275 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.264287 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.264302 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.264313 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:07Z","lastTransitionTime":"2025-12-02T15:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.266020 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.275642 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84363ac0dfeec81ad7770d6ffd34547605fc51bebb545c4639f4c069bab93ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t48jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2p6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.289118 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5d9dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97edaf10-912b-42e7-a9e7-930381d48508\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea0970d622ab76b94ceab66d1d10d469581574368d38c8cd7c6b7a26f81cb6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s82hp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5d9dn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.298639 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fl25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680c0df1-e4d6-4e1c-a36d-2378e821d2d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdbdb1eb6ad7b6e710feeda6af64e9557ec1c3c938fd850fa5b2835abc45f098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-559sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fl25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.308493 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c95a4730-1427-4097-9ca3-4bd251e7acf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdwf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.318785 4933 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7faf072-ca84-4fd4-9409-b46ca6d4f1b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T15:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://befd22a9932058934ba3614781a7f133b4e432d5488c47697d082722ac11e0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d957eb940436ccb56c610cb177a4670ec0fe3aa5437e333a99c291c4263c5978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defe82313be0dd4818134a0c694acbf728c8e31d0df745b7f2241a3e57c1bd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T15:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d0c36fe96576d4594aff50e5b631d1bc23ea352372722d332d11c9dc6b5b7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T15:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T15:52:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T15:52:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T15:54:07Z is after 2025-08-24T17:21:41Z" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.366667 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.366731 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.366741 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.366755 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:07 crc kubenswrapper[4933]: I1202 15:54:07.366765 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:07Z","lastTransitionTime":"2025-12-02T15:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.052865 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.052979 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:08 crc kubenswrapper[4933]: E1202 15:54:08.053018 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:08 crc kubenswrapper[4933]: E1202 15:54:08.053123 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.053421 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:08 crc kubenswrapper[4933]: E1202 15:54:08.053553 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.089219 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.089488 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.089581 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.089681 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:08 crc kubenswrapper[4933]: I1202 15:54:08.089780 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:08Z","lastTransitionTime":"2025-12-02T15:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.052925 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:09 crc kubenswrapper[4933]: E1202 15:54:09.053170 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.126571 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.126631 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.126644 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.126664 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.126677 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:09Z","lastTransitionTime":"2025-12-02T15:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.228729 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.229126 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.229223 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.229308 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.229386 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:09Z","lastTransitionTime":"2025-12-02T15:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.951610 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.952270 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.952499 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.952937 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:09 crc kubenswrapper[4933]: I1202 15:54:09.953134 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:09Z","lastTransitionTime":"2025-12-02T15:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.052710 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.052730 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:10 crc kubenswrapper[4933]: E1202 15:54:10.052874 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.052910 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:10 crc kubenswrapper[4933]: E1202 15:54:10.053024 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:10 crc kubenswrapper[4933]: E1202 15:54:10.053225 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.056091 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.056145 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.056157 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.056177 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.056190 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:10Z","lastTransitionTime":"2025-12-02T15:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.159873 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.159954 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.159977 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.160010 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.160035 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:10Z","lastTransitionTime":"2025-12-02T15:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.884948 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.884991 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.885002 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.885019 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.885029 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:10Z","lastTransitionTime":"2025-12-02T15:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.988223 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.988891 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.989180 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.989691 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:10 crc kubenswrapper[4933]: I1202 15:54:10.990269 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:10Z","lastTransitionTime":"2025-12-02T15:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.052738 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:11 crc kubenswrapper[4933]: E1202 15:54:11.052963 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.094280 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.094345 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.094362 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.094383 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.094402 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:11Z","lastTransitionTime":"2025-12-02T15:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.197531 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.197878 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.197966 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.198048 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.198130 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:11Z","lastTransitionTime":"2025-12-02T15:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.920469 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.920518 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.920529 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.920545 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:11 crc kubenswrapper[4933]: I1202 15:54:11.920556 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:11Z","lastTransitionTime":"2025-12-02T15:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.024190 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.024242 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.024256 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.024274 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.024288 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:12Z","lastTransitionTime":"2025-12-02T15:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.052983 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.053056 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.052989 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:12 crc kubenswrapper[4933]: E1202 15:54:12.053168 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:12 crc kubenswrapper[4933]: E1202 15:54:12.053466 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:12 crc kubenswrapper[4933]: E1202 15:54:12.054161 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.054885 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 15:54:12 crc kubenswrapper[4933]: E1202 15:54:12.055118 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8mklc_openshift-ovn-kubernetes(1972064c-ea30-421c-b009-2bc675a98fcc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.127123 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.127163 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.127177 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.127193 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.127205 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:12Z","lastTransitionTime":"2025-12-02T15:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.230087 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.230461 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.230568 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.230640 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.230718 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:12Z","lastTransitionTime":"2025-12-02T15:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.334227 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.334292 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.334310 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.334338 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.334358 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:12Z","lastTransitionTime":"2025-12-02T15:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.375055 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.375125 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.375153 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.375200 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.375222 4933 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T15:54:12Z","lastTransitionTime":"2025-12-02T15:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.442518 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4"] Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.443323 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.445632 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.446159 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.447654 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.447995 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.484145 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s488w" podStartSLOduration=89.484115579 podStartE2EDuration="1m29.484115579s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.467543476 +0000 UTC m=+115.718770219" watchObservedRunningTime="2025-12-02 15:54:12.484115579 +0000 UTC m=+115.735342322" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.484476 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=49.484467208 podStartE2EDuration="49.484467208s" podCreationTimestamp="2025-12-02 15:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.483853422 +0000 UTC m=+115.735080135" watchObservedRunningTime="2025-12-02 15:54:12.484467208 +0000 UTC m=+115.735693951" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.528542 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=29.528524024 podStartE2EDuration="29.528524024s" podCreationTimestamp="2025-12-02 15:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.527193828 +0000 UTC m=+115.778420541" watchObservedRunningTime="2025-12-02 15:54:12.528524024 +0000 UTC m=+115.779750727" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.574519 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/59c70163-c0d4-4702-be2e-fbf1782ddfbb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.574612 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c70163-c0d4-4702-be2e-fbf1782ddfbb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.574662 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c70163-c0d4-4702-be2e-fbf1782ddfbb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.574729 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59c70163-c0d4-4702-be2e-fbf1782ddfbb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.574784 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/59c70163-c0d4-4702-be2e-fbf1782ddfbb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.586436 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=91.586411018 podStartE2EDuration="1m31.586411018s" podCreationTimestamp="2025-12-02 15:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.54898955 +0000 UTC m=+115.800216253" watchObservedRunningTime="2025-12-02 15:54:12.586411018 +0000 UTC m=+115.837637721" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.611780 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fl25w" podStartSLOduration=90.611754295 podStartE2EDuration="1m30.611754295s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.600425182 +0000 UTC m=+115.851651905" watchObservedRunningTime="2025-12-02 15:54:12.611754295 +0000 UTC m=+115.862981048" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.635637 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.635614611 podStartE2EDuration="1m0.635614611s" podCreationTimestamp="2025-12-02 15:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.635503918 +0000 UTC m=+115.886730621" watchObservedRunningTime="2025-12-02 15:54:12.635614611 +0000 UTC m=+115.886841314" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.662978 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podStartSLOduration=90.662950841 podStartE2EDuration="1m30.662950841s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.662395306 +0000 UTC m=+115.913622049" watchObservedRunningTime="2025-12-02 15:54:12.662950841 +0000 UTC m=+115.914177584" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.676455 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c70163-c0d4-4702-be2e-fbf1782ddfbb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.676502 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c70163-c0d4-4702-be2e-fbf1782ddfbb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.676555 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59c70163-c0d4-4702-be2e-fbf1782ddfbb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.676599 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/59c70163-c0d4-4702-be2e-fbf1782ddfbb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.676673 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/59c70163-c0d4-4702-be2e-fbf1782ddfbb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.676736 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/59c70163-c0d4-4702-be2e-fbf1782ddfbb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.677105 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/59c70163-c0d4-4702-be2e-fbf1782ddfbb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.677617 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c70163-c0d4-4702-be2e-fbf1782ddfbb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.681806 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5d9dn" podStartSLOduration=90.681773013 podStartE2EDuration="1m30.681773013s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.681697771 +0000 UTC m=+115.932924474" watchObservedRunningTime="2025-12-02 15:54:12.681773013 +0000 UTC m=+115.932999766" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.687809 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c70163-c0d4-4702-be2e-fbf1782ddfbb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.694797 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59c70163-c0d4-4702-be2e-fbf1782ddfbb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ddjx4\" (UID: \"59c70163-c0d4-4702-be2e-fbf1782ddfbb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.767874 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.781603 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z6kjz" podStartSLOduration=90.781583087 podStartE2EDuration="1m30.781583087s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.75322939 +0000 UTC m=+116.004456093" watchObservedRunningTime="2025-12-02 15:54:12.781583087 +0000 UTC m=+116.032809800" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.782221 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.782214394 podStartE2EDuration="1m36.782214394s" podCreationTimestamp="2025-12-02 15:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.777392935 +0000 UTC m=+116.028619638" watchObservedRunningTime="2025-12-02 15:54:12.782214394 +0000 UTC m=+116.033441097" Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.988281 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" event={"ID":"59c70163-c0d4-4702-be2e-fbf1782ddfbb","Type":"ContainerStarted","Data":"e65a78e5ffd394b0e082d74c1600f0562f84351d98715916c1df4def710ad999"} Dec 02 15:54:12 crc kubenswrapper[4933]: I1202 15:54:12.988323 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" event={"ID":"59c70163-c0d4-4702-be2e-fbf1782ddfbb","Type":"ContainerStarted","Data":"651e4b0d919aa1fa3e72f387bd6844ed9e1ee8e948ea3c4006d18424a3c76ae8"} Dec 02 15:54:13 crc kubenswrapper[4933]: I1202 15:54:13.002050 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s779q" podStartSLOduration=91.002030439 podStartE2EDuration="1m31.002030439s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:12.845131233 +0000 UTC m=+116.096357936" watchObservedRunningTime="2025-12-02 15:54:13.002030439 +0000 UTC m=+116.253257142" Dec 02 15:54:13 crc kubenswrapper[4933]: I1202 15:54:13.052503 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:13 crc kubenswrapper[4933]: E1202 15:54:13.052639 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:14 crc kubenswrapper[4933]: I1202 15:54:14.053393 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:14 crc kubenswrapper[4933]: I1202 15:54:14.053653 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:14 crc kubenswrapper[4933]: E1202 15:54:14.053963 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:14 crc kubenswrapper[4933]: I1202 15:54:14.053975 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:14 crc kubenswrapper[4933]: E1202 15:54:14.054051 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:14 crc kubenswrapper[4933]: E1202 15:54:14.054264 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:15 crc kubenswrapper[4933]: I1202 15:54:15.052573 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:15 crc kubenswrapper[4933]: E1202 15:54:15.052812 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:16 crc kubenswrapper[4933]: I1202 15:54:16.052586 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:16 crc kubenswrapper[4933]: I1202 15:54:16.052674 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:16 crc kubenswrapper[4933]: I1202 15:54:16.052606 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:16 crc kubenswrapper[4933]: E1202 15:54:16.052728 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:16 crc kubenswrapper[4933]: E1202 15:54:16.052855 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:16 crc kubenswrapper[4933]: E1202 15:54:16.052907 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.009468 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/1.log" Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.010058 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/0.log" Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.010221 4933 generic.go:334] "Generic (PLEG): container finished" podID="b033c545-93a2-4401-842b-22456e44216b" containerID="cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a" exitCode=1 Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.010346 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerDied","Data":"cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a"} Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.010438 4933 scope.go:117] "RemoveContainer" containerID="a5307d7bbe56091012f9975b2a42eafb27d8c90b53817f1f82d8269e23456759" Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.011041 4933 scope.go:117] "RemoveContainer" containerID="cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a" Dec 02 15:54:17 crc kubenswrapper[4933]: E1202 15:54:17.011355 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-z6kjz_openshift-multus(b033c545-93a2-4401-842b-22456e44216b)\"" pod="openshift-multus/multus-z6kjz" podUID="b033c545-93a2-4401-842b-22456e44216b" Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.040517 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ddjx4" podStartSLOduration=95.040493077 podStartE2EDuration="1m35.040493077s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:13.003052707 +0000 UTC m=+116.254279410" watchObservedRunningTime="2025-12-02 15:54:17.040493077 +0000 UTC m=+120.291719790" Dec 02 15:54:17 crc kubenswrapper[4933]: I1202 15:54:17.053166 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:17 crc kubenswrapper[4933]: E1202 15:54:17.055479 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:17 crc kubenswrapper[4933]: E1202 15:54:17.082701 4933 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 15:54:17 crc kubenswrapper[4933]: E1202 15:54:17.206612 4933 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 15:54:18 crc kubenswrapper[4933]: I1202 15:54:18.016339 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/1.log" Dec 02 15:54:18 crc kubenswrapper[4933]: I1202 15:54:18.052809 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:18 crc kubenswrapper[4933]: I1202 15:54:18.052944 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:18 crc kubenswrapper[4933]: I1202 15:54:18.052992 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:18 crc kubenswrapper[4933]: E1202 15:54:18.053002 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:18 crc kubenswrapper[4933]: E1202 15:54:18.053096 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:18 crc kubenswrapper[4933]: E1202 15:54:18.053168 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:19 crc kubenswrapper[4933]: I1202 15:54:19.052687 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:19 crc kubenswrapper[4933]: E1202 15:54:19.052950 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:20 crc kubenswrapper[4933]: I1202 15:54:20.052759 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:20 crc kubenswrapper[4933]: I1202 15:54:20.052865 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:20 crc kubenswrapper[4933]: E1202 15:54:20.053106 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:20 crc kubenswrapper[4933]: I1202 15:54:20.052911 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:20 crc kubenswrapper[4933]: E1202 15:54:20.053343 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:20 crc kubenswrapper[4933]: E1202 15:54:20.053437 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:21 crc kubenswrapper[4933]: I1202 15:54:21.052819 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:21 crc kubenswrapper[4933]: E1202 15:54:21.053137 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:22 crc kubenswrapper[4933]: I1202 15:54:22.052771 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:22 crc kubenswrapper[4933]: I1202 15:54:22.052863 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:22 crc kubenswrapper[4933]: I1202 15:54:22.052900 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:22 crc kubenswrapper[4933]: E1202 15:54:22.053014 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:22 crc kubenswrapper[4933]: E1202 15:54:22.053078 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:22 crc kubenswrapper[4933]: E1202 15:54:22.053152 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:22 crc kubenswrapper[4933]: E1202 15:54:22.208370 4933 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 15:54:23 crc kubenswrapper[4933]: I1202 15:54:23.052660 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:23 crc kubenswrapper[4933]: E1202 15:54:23.052897 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:24 crc kubenswrapper[4933]: I1202 15:54:24.053148 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:24 crc kubenswrapper[4933]: I1202 15:54:24.053224 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:24 crc kubenswrapper[4933]: I1202 15:54:24.053247 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:24 crc kubenswrapper[4933]: E1202 15:54:24.053302 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:24 crc kubenswrapper[4933]: E1202 15:54:24.053398 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:24 crc kubenswrapper[4933]: E1202 15:54:24.053528 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:24 crc kubenswrapper[4933]: I1202 15:54:24.054367 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 15:54:24 crc kubenswrapper[4933]: I1202 15:54:24.820946 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qbps2"] Dec 02 15:54:24 crc kubenswrapper[4933]: I1202 15:54:24.821478 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:24 crc kubenswrapper[4933]: E1202 15:54:24.821596 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:25 crc kubenswrapper[4933]: I1202 15:54:25.044790 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/3.log" Dec 02 15:54:25 crc kubenswrapper[4933]: I1202 15:54:25.048272 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerStarted","Data":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"} Dec 02 15:54:25 crc kubenswrapper[4933]: I1202 15:54:25.048833 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 15:54:25 crc kubenswrapper[4933]: I1202 15:54:25.079073 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podStartSLOduration=103.079047357 podStartE2EDuration="1m43.079047357s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:25.078914983 +0000 UTC m=+128.330141696" watchObservedRunningTime="2025-12-02 15:54:25.079047357 +0000 UTC m=+128.330274060" Dec 02 15:54:26 crc kubenswrapper[4933]: I1202 15:54:26.052747 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:26 crc kubenswrapper[4933]: I1202 15:54:26.052807 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:26 crc kubenswrapper[4933]: I1202 15:54:26.052804 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:26 crc kubenswrapper[4933]: I1202 15:54:26.052976 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:26 crc kubenswrapper[4933]: E1202 15:54:26.052961 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:26 crc kubenswrapper[4933]: E1202 15:54:26.053128 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:26 crc kubenswrapper[4933]: E1202 15:54:26.053202 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:26 crc kubenswrapper[4933]: E1202 15:54:26.053242 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:27 crc kubenswrapper[4933]: E1202 15:54:27.210813 4933 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 15:54:28 crc kubenswrapper[4933]: I1202 15:54:28.052524 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:28 crc kubenswrapper[4933]: I1202 15:54:28.052552 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:28 crc kubenswrapper[4933]: I1202 15:54:28.052542 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:28 crc kubenswrapper[4933]: I1202 15:54:28.052564 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:28 crc kubenswrapper[4933]: E1202 15:54:28.052673 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:28 crc kubenswrapper[4933]: E1202 15:54:28.052798 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:28 crc kubenswrapper[4933]: E1202 15:54:28.052870 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:28 crc kubenswrapper[4933]: E1202 15:54:28.052930 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:30 crc kubenswrapper[4933]: I1202 15:54:30.052713 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:30 crc kubenswrapper[4933]: E1202 15:54:30.053394 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 15:54:30 crc kubenswrapper[4933]: I1202 15:54:30.052778 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:30 crc kubenswrapper[4933]: E1202 15:54:30.053517 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 15:54:30 crc kubenswrapper[4933]: I1202 15:54:30.052802 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:30 crc kubenswrapper[4933]: E1202 15:54:30.053737 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0" Dec 02 15:54:30 crc kubenswrapper[4933]: I1202 15:54:30.052714 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:30 crc kubenswrapper[4933]: E1202 15:54:30.053873 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 15:54:32 crc kubenswrapper[4933]: I1202 15:54:32.053406 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:54:32 crc kubenswrapper[4933]: I1202 15:54:32.053456 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:32 crc kubenswrapper[4933]: I1202 15:54:32.053433 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:32 crc kubenswrapper[4933]: I1202 15:54:32.053404 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 15:54:32 crc kubenswrapper[4933]: I1202 15:54:32.053404 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:54:32 crc kubenswrapper[4933]: I1202 15:54:32.054518 4933 scope.go:117] "RemoveContainer" containerID="cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a"
Dec 02 15:54:32 crc kubenswrapper[4933]: E1202 15:54:32.054626 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:54:32 crc kubenswrapper[4933]: E1202 15:54:32.054591 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:54:32 crc kubenswrapper[4933]: E1202 15:54:32.054697 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:54:32 crc kubenswrapper[4933]: E1202 15:54:32.054782 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:54:32 crc kubenswrapper[4933]: E1202 15:54:32.212985 4933 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 15:54:33 crc kubenswrapper[4933]: I1202 15:54:33.082135 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/1.log"
Dec 02 15:54:33 crc kubenswrapper[4933]: I1202 15:54:33.082216 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerStarted","Data":"fdc8b3cab6425c14392d3e1388ade4844bac3e7998924034f044e5a47d73be67"}
Dec 02 15:54:34 crc kubenswrapper[4933]: I1202 15:54:34.053241 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:54:34 crc kubenswrapper[4933]: I1202 15:54:34.053296 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:54:34 crc kubenswrapper[4933]: E1202 15:54:34.053351 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 15:54:34 crc kubenswrapper[4933]: E1202 15:54:34.053420 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:54:34 crc kubenswrapper[4933]: I1202 15:54:34.053487 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:34 crc kubenswrapper[4933]: E1202 15:54:34.053544 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:54:34 crc kubenswrapper[4933]: I1202 15:54:34.053594 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:54:34 crc kubenswrapper[4933]: E1202 15:54:34.053654 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:54:36 crc kubenswrapper[4933]: I1202 15:54:36.053090 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:36 crc kubenswrapper[4933]: I1202 15:54:36.053107 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:54:36 crc kubenswrapper[4933]: I1202 15:54:36.053153 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:54:36 crc kubenswrapper[4933]: I1202 15:54:36.053235 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:54:36 crc kubenswrapper[4933]: E1202 15:54:36.053378 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbps2" podUID="c95a4730-1427-4097-9ca3-4bd251e7acf0"
Dec 02 15:54:36 crc kubenswrapper[4933]: E1202 15:54:36.053516 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 15:54:36 crc kubenswrapper[4933]: E1202 15:54:36.053624 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 15:54:36 crc kubenswrapper[4933]: E1202 15:54:36.053679 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
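Everything above this point is the same two-second retry loop: the kubelet re-queues pod sync for these four pods because no CNI network config has yet appeared in /etc/kubernetes/cni/net.d/. A minimal, hedged sketch of a watcher for that condition (stdlib Go only; the directory path comes from the errors above, while the .conf/.conflist/.json extension set is borrowed from the usual libcni convention, not a claim about the exact kubelet code path):

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// Directory named in the "no CNI configuration file" errors above.
const cniConfDir = "/etc/kubernetes/cni/net.d"

// hasCNIConfig reports whether any plausible CNI config file exists yet.
func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false // directory missing or unreadable counts as "not ready"
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // assumed extension set
			return true
		}
	}
	return false
}

func main() {
	// Poll every 2s, roughly matching the retry cadence visible in the log.
	for range time.Tick(2 * time.Second) {
		if hasCNIConfig(cniConfDir) {
			fmt.Println("CNI config present; network plugin should report ready soon")
			return
		}
		fmt.Println("still no CNI config in", cniConfDir)
	}
}

Once the network provider drops its config file there, NetworkReady flips and the "Error syncing pod" entries stop, which is exactly what the log shows below once ovnkube-node reports ready at 15:54:42.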
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.053299 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.053299 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.053354 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.053394 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.056607 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.057084 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.059351 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.059598 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.059855 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 02 15:54:38 crc kubenswrapper[4933]: I1202 15:54:38.060021 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 02 15:54:42 crc kubenswrapper[4933]: I1202 15:54:42.719545 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.075395 4933 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
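The reflector.go:368 "Caches populated" entries mark client-go informer caches finishing their initial LIST+WATCH for each ConfigMap and Secret that a pod on this node references; volume setup waits on those caches. A hedged sketch of the same primitive using standard client-go (the kubeconfig path is an assumption; the informer factory and WaitForCacheSync APIs are standard):

package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed admin kubeconfig location; any working kubeconfig will do.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	factory := informers.NewSharedInformerFactory(cs, 30*time.Second)
	cmInformer := factory.Core().V1().ConfigMaps().Informer()
	secretInformer := factory.Core().V1().Secrets().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	// Rough equivalent of the "Caches populated" lines: block until the
	// initial list-and-watch for each type has filled the local cache.
	if !cache.WaitForCacheSync(stop, cmInformer.HasSynced, secretInformer.HasSynced) {
		panic("caches never synced")
	}
	fmt.Println("ConfigMap and Secret caches populated")
}

The readiness probe passing for ovnkube-node, followed immediately by the NodeReady event, is what unblocks scheduling of the workload pods whose ADDs flood in below.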
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.106934 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5ttq4"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.107751 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.107901 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w8l5x"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.108589 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.116421 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.116796 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.116925 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.117017 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.117113 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.117647 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.117762 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.117759 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.117902 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.119185 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.119243 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.119332 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.119408 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.119434 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.119496 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.121510 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kmxmh"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.121950 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.125079 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.125237 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.125468 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.126599 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.127269 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-brhm4"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.128136 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6dp2z"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.128794 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6dp2z"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.129603 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.131047 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9r8tr"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.131574 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9r8tr"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.132525 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.132748 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.133557 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.133722 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.134213 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.134488 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.135003 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.135869 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.136128 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.136254 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.137166 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.138086 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lvtt7"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.139095 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lcwgg"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.139697 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.140412 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.141206 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lcwgg"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.139717 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.147025 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.147213 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.147312 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.158553 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.159022 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-26gvw"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.159073 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.160879 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.161985 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.162800 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.163491 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.163966 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.164634 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
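This burst of "SyncLoop ADD" source="api" entries is the kubelet's apiserver pod source delivering every pod scheduled to this node once the node is Ready and the watch is established. A rough client-go equivalent, filtering pods by spec.nodeName the way the kubelet's pod source does (the node name crc comes from the log; the kubeconfig path is an assumption):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Watch only pods bound to this node, like the kubelet's api pod source.
	factory := informers.NewSharedInformerFactoryWithOptions(cs, 0,
		informers.WithTweakListOptions(func(o *metav1.ListOptions) {
			o.FieldSelector = "spec.nodeName=crc"
		}))
	podInformer := factory.Core().V1().Pods().Informer()
	podInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			p := obj.(*corev1.Pod)
			fmt.Printf("SyncLoop ADD-style event: %s/%s\n", p.Namespace, p.Name)
		},
	})

	stop := make(chan struct{})
	factory.Start(stop)
	cache.WaitForCacheSync(stop, podInformer.HasSynced)
	select {} // keep watching
}

Each ADD is immediately followed by a "No sandbox for pod can be found" line because these pods have never run on this node, so a fresh sandbox must be created for each.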
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.164768 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.178244 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.178599 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.178684 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.178877 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.179105 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.181045 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.181655 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.184712 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.184727 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.185268 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.185472 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.191301 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.191458 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.191538 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193435 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193541 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193585 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193656 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193803 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193934 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.194343 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.194556 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.194701 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.194846 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.194820 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.195091 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.195240 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.193954 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.201375 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.201656 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.202475 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b2sss"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.203085 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.203758 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.203848 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.205280 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.205567 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.205744 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.205915 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.206105 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.206266 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.206409 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.206554 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.206698 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.206976 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.209207 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.209983 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.211522 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jgdkm"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.211667 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212018 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212106 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.214749 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212308 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212349 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212349 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.214939 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212386 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.212649 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.215288 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.215377 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.215441 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.215610 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.215731 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.216040 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.216321 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.220594 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.222034 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.222169 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.232617 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.237351 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5ttq4"]
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.221524 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.238858 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.244214 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.245526 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.245560 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.254656 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.254994 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255345 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255390 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-serving-cert\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255451 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-config\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255504 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255531 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-config\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255559 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt4wv\" (UniqueName: \"kubernetes.io/projected/97c46bec-8e96-4d62-8808-549f712a9802-kube-api-access-mt4wv\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255582 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-auth-proxy-config\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255605 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae2ee36-4dcf-429c-985d-c891c771cb0d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255625 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255646 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/97c46bec-8e96-4d62-8808-549f712a9802-audit-dir\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255664 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-machine-approver-tls\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255686 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-audit-policies\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255706 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-etcd-serving-ca\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255731 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-audit\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255753 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-774lf\" (UID: \"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255774 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255799 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255839 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255863 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae2ee36-4dcf-429c-985d-c891c771cb0d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255884 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxt5\" (UniqueName: \"kubernetes.io/projected/22c6a35b-d256-4c38-861a-f98b7a22d8fa-kube-api-access-xtxt5\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255906 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wvvb\" (UniqueName: \"kubernetes.io/projected/14eded2c-58d8-4348-a7ae-1a027349ae71-kube-api-access-5wvvb\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255945 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14eded2c-58d8-4348-a7ae-1a027349ae71-node-pullsecrets\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.255968 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256000 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256025 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-serving-cert\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256054 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-encryption-config\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256084 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22c6a35b-d256-4c38-861a-f98b7a22d8fa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256120 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256142 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256164 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-image-import-ca\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256185 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pznp\" (UniqueName: \"kubernetes.io/projected/ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84-kube-api-access-6pznp\") pod \"cluster-samples-operator-665b6dd947-774lf\" (UID: \"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256210 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-etcd-client\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256236 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256279 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzrzt\" (UniqueName: \"kubernetes.io/projected/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-kube-api-access-zzrzt\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256305 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-config\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256330 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.256332 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258303 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14eded2c-58d8-4348-a7ae-1a027349ae71-audit-dir\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258341 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bhdx\" (UniqueName: \"kubernetes.io/projected/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-kube-api-access-6bhdx\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258369 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrrn\" (UniqueName: \"kubernetes.io/projected/aae2ee36-4dcf-429c-985d-c891c771cb0d-kube-api-access-bnrrn\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258406 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258427 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-client-ca\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258451 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-config\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258489 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22c6a35b-d256-4c38-861a-f98b7a22d8fa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258522 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk87g\" (UniqueName: \"kubernetes.io/projected/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-kube-api-access-qk87g\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258556 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22c6a35b-d256-4c38-861a-f98b7a22d8fa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.258576 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-images\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4"
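The reconciler_common.go:245 entries come from the kubelet volume manager: for each volume a newly added pod requires, it verifies the volume is attached before mounting, reconciling a desired state of the world against an actual state. A purely conceptual sketch of that desired-versus-actual loop; the types below are invented for illustration and are not the kubelet's real API, though the pod and volume names are taken from the entries above:

package main

import (
	"fmt"
	"time"
)

// volumeKey identifies a (pod, volume) pair, loosely mirroring how the
// volume manager tracks per-pod volume requirements. Invented for this sketch.
type volumeKey struct{ pod, volume string }

type reconciler struct {
	desired map[volumeKey]bool // volumes pods on this node need
	actual  map[volumeKey]bool // volumes already verified/mounted
}

// reconcile brings actual state toward desired state, one volume at a time.
func (r *reconciler) reconcile() {
	for k := range r.desired {
		if !r.actual[k] {
			// Stand-in for VerifyControllerAttachedVolume followed by MountVolume.
			fmt.Printf("verifying/attaching %q for pod %q\n", k.volume, k.pod)
			r.actual[k] = true
		}
	}
}

func main() {
	r := &reconciler{
		desired: map[volumeKey]bool{
			{"oauth-openshift-558db77b4-26gvw", "v4-0-config-system-session"}: true,
			{"apiserver-76f77b778f-w8l5x", "etcd-client"}:                     true,
		},
		actual: map[volumeKey]bool{},
	}
	// The real reconciler runs continuously; three passes suffice here.
	for i := 0; i < 3; i++ {
		r.reconcile()
		time.Sleep(100 * time.Millisecond)
	}
}

Once every volume for a pod reaches the actual state, sandbox creation and container start can proceed, which is why these lines appear between the SyncLoop ADDs and the subsequent container events.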
\"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.265562 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.265855 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.266353 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.266758 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.266928 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.275988 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.277038 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.277090 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.277050 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.277882 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dgdm5"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.280460 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.282285 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.282918 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.285900 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d86pn"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.286573 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.289762 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.290182 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pvt57"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.290436 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.290758 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.291236 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.291389 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.294699 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tgwdx"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.295605 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.296872 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.297576 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.298065 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.299841 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.300575 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r7dnz"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.300957 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.301179 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.302415 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.302893 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.303514 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.303968 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.304971 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.305708 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.306055 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.306447 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.307229 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w8l5x"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.309544 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kmxmh"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.310160 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.310520 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.310651 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.313438 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.313993 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.315562 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ws24r"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.316138 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.320562 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-brhm4"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.322906 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6dp2z"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.326181 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5xvkq"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.328727 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.330281 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.330890 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.336941 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9r8tr"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.340036 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.344606 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.345269 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.346311 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.348222 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.349624 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.349800 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.350001 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.351371 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.352158 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lcwgg"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.353174 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pvt57"] Dec 02 15:54:43 crc 
kubenswrapper[4933]: I1202 15:54:43.354158 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.355200 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.356702 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b2sss"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.357895 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r7dnz"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.358878 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.359235 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-encryption-config\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.359265 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22c6a35b-d256-4c38-861a-f98b7a22d8fa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.359293 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360772 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360856 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-image-import-ca\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360884 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pznp\" (UniqueName: \"kubernetes.io/projected/ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84-kube-api-access-6pznp\") pod \"cluster-samples-operator-665b6dd947-774lf\" (UID: \"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360919 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-etcd-client\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360943 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360969 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrzt\" (UniqueName: \"kubernetes.io/projected/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-kube-api-access-zzrzt\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361002 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361027 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-config\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361048 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14eded2c-58d8-4348-a7ae-1a027349ae71-audit-dir\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361074 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhdx\" (UniqueName: \"kubernetes.io/projected/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-kube-api-access-6bhdx\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361096 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrrn\" (UniqueName: \"kubernetes.io/projected/aae2ee36-4dcf-429c-985d-c891c771cb0d-kube-api-access-bnrrn\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361125 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361147 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-client-ca\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361168 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-config\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361189 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22c6a35b-d256-4c38-861a-f98b7a22d8fa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361216 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk87g\" (UniqueName: \"kubernetes.io/projected/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-kube-api-access-qk87g\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361242 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22c6a35b-d256-4c38-861a-f98b7a22d8fa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361262 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-images\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361290 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361313 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-serving-cert\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361373 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-config\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361409 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361426 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-config\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.360035 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-26gvw"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361487 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361451 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt4wv\" (UniqueName: \"kubernetes.io/projected/97c46bec-8e96-4d62-8808-549f712a9802-kube-api-access-mt4wv\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361557 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-auth-proxy-config\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361592 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae2ee36-4dcf-429c-985d-c891c771cb0d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361622 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361656 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/97c46bec-8e96-4d62-8808-549f712a9802-audit-dir\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361684 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-machine-approver-tls\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361716 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-audit-policies\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361744 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-etcd-serving-ca\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361770 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-audit\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361796 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-774lf\" (UID: \"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361844 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361875 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361904 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361933 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae2ee36-4dcf-429c-985d-c891c771cb0d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.361962 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxt5\" (UniqueName: \"kubernetes.io/projected/22c6a35b-d256-4c38-861a-f98b7a22d8fa-kube-api-access-xtxt5\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.362010 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14eded2c-58d8-4348-a7ae-1a027349ae71-node-pullsecrets\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.362037 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.362091 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wvvb\" (UniqueName: \"kubernetes.io/projected/14eded2c-58d8-4348-a7ae-1a027349ae71-kube-api-access-5wvvb\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.362188 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.362255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-serving-cert\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.363943 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-config\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: 
\"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.364124 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-image-import-ca\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.364795 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d86pn"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.364850 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.364868 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.365180 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-client-ca\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.365473 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-images\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.366164 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-config\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.366436 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v2svg"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.367143 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14eded2c-58d8-4348-a7ae-1a027349ae71-node-pullsecrets\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.367384 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-audit\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.367776 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g4vq2"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.368115 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-config\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.368128 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.368166 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.368704 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-serving-cert\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.369012 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.369074 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.369506 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.369629 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-auth-proxy-config\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.369778 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.369925 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14eded2c-58d8-4348-a7ae-1a027349ae71-audit-dir\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.370239 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-config\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.370367 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jgdkm"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.370380 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.370564 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae2ee36-4dcf-429c-985d-c891c771cb0d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.370644 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.371002 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-audit-policies\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.371122 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-774lf\" (UID: \"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.371268 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/97c46bec-8e96-4d62-8808-549f712a9802-audit-dir\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.371279 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-machine-approver-tls\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.371629 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.371886 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae2ee36-4dcf-429c-985d-c891c771cb0d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.372126 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.372434 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14eded2c-58d8-4348-a7ae-1a027349ae71-etcd-serving-ca\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.372924 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.373548 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22c6a35b-d256-4c38-861a-f98b7a22d8fa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.373980 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ws24r"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.375348 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lvtt7"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.375490 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-encryption-config\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.375877 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.376038 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.377481 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-serving-cert\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.377486 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14eded2c-58d8-4348-a7ae-1a027349ae71-etcd-client\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.377591 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.377667 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22c6a35b-d256-4c38-861a-f98b7a22d8fa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.377735 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tgwdx"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.377785 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.378097 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.378629 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.379223 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.379358 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.380455 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v2svg"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.380648 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.381686 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.382722 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g4vq2"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.383765 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5xvkq"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.384812 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-57cjj"] Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.385425 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.390024 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.450864 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463175 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-service-ca\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463241 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-certificates\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463260 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-oauth-config\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463276 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463319 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463352 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrtr5\" (UniqueName: \"kubernetes.io/projected/4762fc2a-f608-4992-a53f-72ba66df1820-kube-api-access-wrtr5\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463382 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-config\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463439 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-trusted-ca-bundle\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463469 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463487 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4762fc2a-f608-4992-a53f-72ba66df1820-serving-cert\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463513 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9c5\" (UniqueName: \"kubernetes.io/projected/50f1708b-32f7-42c3-a3ec-57f654624efa-kube-api-access-4c9c5\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463531 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071e51e3-11fa-4ff1-9417-b7fbca815e88-serving-cert\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463662 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4762fc2a-f608-4992-a53f-72ba66df1820-trusted-ca\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463767 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ppt4\" (UniqueName: \"kubernetes.io/projected/071e51e3-11fa-4ff1-9417-b7fbca815e88-kube-api-access-9ppt4\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463924 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-client-ca\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.463996 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.464015 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:43.963988896 +0000 UTC m=+147.215215639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464058 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-service-ca-bundle\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464117 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464169 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-tls\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464210 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1708b-32f7-42c3-a3ec-57f654624efa-serving-cert\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464244 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-trusted-ca\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464318 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-oauth-serving-cert\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464720 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7qx\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-kube-api-access-jt7qx\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 
15:54:43.464764 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd48q\" (UniqueName: \"kubernetes.io/projected/fcc5c745-13b8-4ff8-b677-095dd8a46081-kube-api-access-rd48q\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464782 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-config\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464852 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkt4\" (UniqueName: \"kubernetes.io/projected/8b382362-3187-4571-a8a0-057cbccc89ff-kube-api-access-2zkt4\") pod \"downloads-7954f5f757-9r8tr\" (UID: \"8b382362-3187-4571-a8a0-057cbccc89ff\") " pod="openshift-console/downloads-7954f5f757-9r8tr" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464912 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-serving-cert\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464929 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-config\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464945 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4762fc2a-f608-4992-a53f-72ba66df1820-config\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.464993 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-bound-sa-token\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.470407 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.490686 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.510070 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 
15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.530061 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.550014 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566485 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566749 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b25c4-e262-410b-aec8-e18fdb93d0c7-service-ca-bundle\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566783 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566816 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrtr5\" (UniqueName: \"kubernetes.io/projected/4762fc2a-f608-4992-a53f-72ba66df1820-kube-api-access-wrtr5\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566869 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97b17284-20cf-4265-a2e0-721fe06f8105-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566885 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b17284-20cf-4265-a2e0-721fe06f8105-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566930 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-certs\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566946 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/748b551d-be9d-4862-9a91-12ca4ccc71fa-tmpfs\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566961 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-etcd-client\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566976 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t26cf\" (UniqueName: \"kubernetes.io/projected/043b25c4-e262-410b-aec8-e18fdb93d0c7-kube-api-access-t26cf\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.566993 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4f2h\" (UniqueName: \"kubernetes.io/projected/97b17284-20cf-4265-a2e0-721fe06f8105-kube-api-access-n4f2h\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567009 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b850db5e-74db-4823-bbb3-132b75b17c30-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567028 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgfgv\" (UniqueName: \"kubernetes.io/projected/ff11bd0d-41c3-45a6-a955-259a754887e3-kube-api-access-cgfgv\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567045 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbpt5\" (UniqueName: \"kubernetes.io/projected/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-kube-api-access-lbpt5\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567077 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ff11bd0d-41c3-45a6-a955-259a754887e3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567092 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba88c07b-f3ab-4abd-ad64-72d34148bc09-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567117 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4762fc2a-f608-4992-a53f-72ba66df1820-trusted-ca\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567137 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331221b7-75d0-458f-84d2-098fa5170277-config\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567169 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ppt4\" (UniqueName: \"kubernetes.io/projected/071e51e3-11fa-4ff1-9417-b7fbca815e88-kube-api-access-9ppt4\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567194 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/028c8291-1fc5-4035-b46f-3c618ecf154d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567212 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-audit-policies\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567242 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/061606b3-9a47-4cff-ad31-04e9a5a05528-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cx7jx\" (UID: \"061606b3-9a47-4cff-ad31-04e9a5a05528\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567269 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a11c22-fa5a-4573-ac7b-44bad6178356-serving-cert\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" 
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567284 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fjl\" (UniqueName: \"kubernetes.io/projected/78a11c22-fa5a-4573-ac7b-44bad6178356-kube-api-access-22fjl\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567305 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567321 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-service-ca-bundle\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567353 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-csi-data-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567372 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1708b-32f7-42c3-a3ec-57f654624efa-serving-cert\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567396 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-oauth-serving-cert\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567410 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd48q\" (UniqueName: \"kubernetes.io/projected/fcc5c745-13b8-4ff8-b677-095dd8a46081-kube-api-access-rd48q\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567426 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-config\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567442 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6bqwr\" (UniqueName: \"kubernetes.io/projected/89e4dcb7-4b26-46b9-9346-61d13b21285a-kube-api-access-6bqwr\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567458 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-default-certificate\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567475 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11c9d818-4012-423f-b3ce-bec8ac30f1d7-secret-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567489 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-socket-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567505 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-stats-auth\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567520 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-metrics-certs\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567537 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-config\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567552 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-serving-cert\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567579 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk72g\" (UniqueName: \"kubernetes.io/projected/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-kube-api-access-qk72g\") pod \"etcd-operator-b45778765-d86pn\" (UID: 
\"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567597 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/028c8291-1fc5-4035-b46f-3c618ecf154d-proxy-tls\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567614 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-ca\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567629 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b850db5e-74db-4823-bbb3-132b75b17c30-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567645 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c657682e-039a-46ee-b4c7-a5f95ee75e44-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567660 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/331221b7-75d0-458f-84d2-098fa5170277-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567678 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52dn\" (UniqueName: \"kubernetes.io/projected/806c8633-7969-405c-b445-734ae20ede22-kube-api-access-z52dn\") pod \"multus-admission-controller-857f4d67dd-tgwdx\" (UID: \"806c8633-7969-405c-b445-734ae20ede22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567696 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-serving-cert\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567713 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-config\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" 
Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567733 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0806294-1664-4633-ba20-7b687a8cf4b2-proxy-tls\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567750 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-mountpoint-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567767 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-bound-sa-token\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567783 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-service-ca\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567799 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5fx\" (UniqueName: \"kubernetes.io/projected/da774c74-89be-4c8c-b2e2-0bad771112d2-kube-api-access-vj5fx\") pod \"package-server-manager-789f6589d5-4wd5d\" (UID: \"da774c74-89be-4c8c-b2e2-0bad771112d2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567815 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-oauth-config\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567857 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567877 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37c7afce-94aa-439e-9415-cdf79931a95e-cert\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567893 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b850db5e-74db-4823-bbb3-132b75b17c30-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567909 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-config\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567928 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw6kr\" (UniqueName: \"kubernetes.io/projected/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-kube-api-access-bw6kr\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.567979 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfchz\" (UniqueName: \"kubernetes.io/projected/11c9d818-4012-423f-b3ce-bec8ac30f1d7-kube-api-access-nfchz\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568005 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/748b551d-be9d-4862-9a91-12ca4ccc71fa-webhook-cert\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568021 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568036 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-encryption-config\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568051 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-trusted-ca-bundle\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568067 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba88c07b-f3ab-4abd-ad64-72d34148bc09-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568083 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4762fc2a-f608-4992-a53f-72ba66df1820-serving-cert\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568100 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbd45ad7-14dd-46e8-941b-4d2bfced7567-metrics-tls\") pod \"dns-operator-744455d44c-jgdkm\" (UID: \"cbd45ad7-14dd-46e8-941b-4d2bfced7567\") " pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568132 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9c5\" (UniqueName: \"kubernetes.io/projected/50f1708b-32f7-42c3-a3ec-57f654624efa-kube-api-access-4c9c5\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568149 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-serving-cert\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568166 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0add1e2e-7e5a-4e56-8d51-85d089b4573c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568183 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6s8\" (UniqueName: \"kubernetes.io/projected/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-kube-api-access-ct6s8\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568198 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbsnr\" (UniqueName: \"kubernetes.io/projected/cbd45ad7-14dd-46e8-941b-4d2bfced7567-kube-api-access-wbsnr\") pod \"dns-operator-744455d44c-jgdkm\" (UID: \"cbd45ad7-14dd-46e8-941b-4d2bfced7567\") " pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568218 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071e51e3-11fa-4ff1-9417-b7fbca815e88-serving-cert\") pod 
\"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568235 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qxk\" (UniqueName: \"kubernetes.io/projected/cb41f368-0638-40fd-be0f-bc71d0182af8-kube-api-access-98qxk\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568252 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd4hf\" (UniqueName: \"kubernetes.io/projected/e0806294-1664-4633-ba20-7b687a8cf4b2-kube-api-access-rd4hf\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568300 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/da774c74-89be-4c8c-b2e2-0bad771112d2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4wd5d\" (UID: \"da774c74-89be-4c8c-b2e2-0bad771112d2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568319 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-profile-collector-cert\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568346 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568363 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-serving-cert\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568378 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6bdr\" (UniqueName: \"kubernetes.io/projected/37c7afce-94aa-439e-9415-cdf79931a95e-kube-api-access-n6bdr\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568400 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn89\" (UniqueName: 
\"kubernetes.io/projected/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-kube-api-access-hsn89\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568414 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzhxh\" (UniqueName: \"kubernetes.io/projected/028c8291-1fc5-4035-b46f-3c618ecf154d-kube-api-access-rzhxh\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568431 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-client-ca\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568447 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-cabundle\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568465 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0add1e2e-7e5a-4e56-8d51-85d089b4573c-config\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568482 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrn7\" (UniqueName: \"kubernetes.io/projected/ba88c07b-f3ab-4abd-ad64-72d34148bc09-kube-api-access-9vrn7\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568501 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e0806294-1664-4633-ba20-7b687a8cf4b2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568520 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/806c8633-7969-405c-b445-734ae20ede22-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tgwdx\" (UID: \"806c8633-7969-405c-b445-734ae20ede22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568536 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ff11bd0d-41c3-45a6-a955-259a754887e3-srv-cert\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568553 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568581 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcnlc\" (UniqueName: \"kubernetes.io/projected/061606b3-9a47-4cff-ad31-04e9a5a05528-kube-api-access-fcnlc\") pod \"control-plane-machine-set-operator-78cbb6b69f-cx7jx\" (UID: \"061606b3-9a47-4cff-ad31-04e9a5a05528\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568598 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-tls\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568616 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-node-bootstrap-token\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568632 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0cbe25c-56a8-4824-ace7-fff562288389-config-volume\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568650 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-trusted-ca\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568667 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zvp\" (UniqueName: \"kubernetes.io/projected/748b551d-be9d-4862-9a91-12ca4ccc71fa-kube-api-access-z9zvp\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568683 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-audit-dir\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568727 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-key\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568742 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c657682e-039a-46ee-b4c7-a5f95ee75e44-trusted-ca\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568757 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-registration-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568773 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkt4\" (UniqueName: \"kubernetes.io/projected/8b382362-3187-4571-a8a0-057cbccc89ff-kube-api-access-2zkt4\") pod \"downloads-7954f5f757-9r8tr\" (UID: \"8b382362-3187-4571-a8a0-057cbccc89ff\") " pod="openshift-console/downloads-7954f5f757-9r8tr" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568790 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568805 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/748b551d-be9d-4862-9a91-12ca4ccc71fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568844 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7qx\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-kube-api-access-jt7qx\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568862 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-config\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568887 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-plugins-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568904 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331221b7-75d0-458f-84d2-098fa5170277-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568922 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jcxs\" (UniqueName: \"kubernetes.io/projected/e75ec078-9907-4485-a7a4-991622b1788d-kube-api-access-2jcxs\") pod \"migrator-59844c95c7-xfcv2\" (UID: \"e75ec078-9907-4485-a7a4-991622b1788d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568952 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-service-ca\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568970 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/78a11c22-fa5a-4573-ac7b-44bad6178356-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.568999 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4762fc2a-f608-4992-a53f-72ba66df1820-config\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569015 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knnx\" (UniqueName: \"kubernetes.io/projected/f863f677-8452-4742-bb8f-1307b893c75a-kube-api-access-7knnx\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569048 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0cbe25c-56a8-4824-ace7-fff562288389-metrics-tls\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569064 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrd7k\" (UniqueName: \"kubernetes.io/projected/c657682e-039a-46ee-b4c7-a5f95ee75e44-kube-api-access-hrd7k\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569081 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-srv-cert\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569097 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0add1e2e-7e5a-4e56-8d51-85d089b4573c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569133 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569151 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c657682e-039a-46ee-b4c7-a5f95ee75e44-metrics-tls\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569190 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-client\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569216 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxw7\" (UniqueName: \"kubernetes.io/projected/d0cbe25c-56a8-4824-ace7-fff562288389-kube-api-access-nbxw7\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569289 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-certificates\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569317 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569334 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/028c8291-1fc5-4035-b46f-3c618ecf154d-images\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.569379 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.570884 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-config\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.571555 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-trusted-ca-bundle\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.572047 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.572435 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.572864 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-service-ca-bundle\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.573039 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4762fc2a-f608-4992-a53f-72ba66df1820-trusted-ca\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.573483 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-oauth-serving-cert\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.573501 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071e51e3-11fa-4ff1-9417-b7fbca815e88-config\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.573519 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-trusted-ca\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.573584 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4762fc2a-f608-4992-a53f-72ba66df1820-config\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.573746 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071e51e3-11fa-4ff1-9417-b7fbca815e88-serving-cert\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.575157 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-service-ca\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.575252 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1708b-32f7-42c3-a3ec-57f654624efa-serving-cert\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.575649 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-config\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.575728 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4762fc2a-f608-4992-a53f-72ba66df1820-serving-cert\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.576020 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-client-ca\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.576076 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.076054234 +0000 UTC m=+147.327281037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.576204 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-certificates\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.577551 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-tls\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.578186 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-oauth-config\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.578190 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-serving-cert\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.578600 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.585370 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:43 crc kubenswrapper[4933]: 
I1202 15:54:43.590931 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.610946 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.630167 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.650240 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670331 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670410 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-mountpoint-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670445 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5fx\" (UniqueName: \"kubernetes.io/projected/da774c74-89be-4c8c-b2e2-0bad771112d2-kube-api-access-vj5fx\") pod \"package-server-manager-789f6589d5-4wd5d\" (UID: \"da774c74-89be-4c8c-b2e2-0bad771112d2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670469 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37c7afce-94aa-439e-9415-cdf79931a95e-cert\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670495 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b850db5e-74db-4823-bbb3-132b75b17c30-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670514 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw6kr\" (UniqueName: \"kubernetes.io/projected/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-kube-api-access-bw6kr\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670521 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-mountpoint-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670538 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nfchz\" (UniqueName: \"kubernetes.io/projected/11c9d818-4012-423f-b3ce-bec8ac30f1d7-kube-api-access-nfchz\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670593 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba88c07b-f3ab-4abd-ad64-72d34148bc09-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670615 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/748b551d-be9d-4862-9a91-12ca4ccc71fa-webhook-cert\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670631 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670647 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-encryption-config\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670679 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670697 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbd45ad7-14dd-46e8-941b-4d2bfced7567-metrics-tls\") pod \"dns-operator-744455d44c-jgdkm\" (UID: \"cbd45ad7-14dd-46e8-941b-4d2bfced7567\") " pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670721 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-serving-cert\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670740 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0add1e2e-7e5a-4e56-8d51-85d089b4573c-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670755 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6s8\" (UniqueName: \"kubernetes.io/projected/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-kube-api-access-ct6s8\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670774 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbsnr\" (UniqueName: \"kubernetes.io/projected/cbd45ad7-14dd-46e8-941b-4d2bfced7567-kube-api-access-wbsnr\") pod \"dns-operator-744455d44c-jgdkm\" (UID: \"cbd45ad7-14dd-46e8-941b-4d2bfced7567\") " pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670792 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/da774c74-89be-4c8c-b2e2-0bad771112d2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4wd5d\" (UID: \"da774c74-89be-4c8c-b2e2-0bad771112d2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670811 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qxk\" (UniqueName: \"kubernetes.io/projected/cb41f368-0638-40fd-be0f-bc71d0182af8-kube-api-access-98qxk\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670842 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd4hf\" (UniqueName: \"kubernetes.io/projected/e0806294-1664-4633-ba20-7b687a8cf4b2-kube-api-access-rd4hf\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670860 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-profile-collector-cert\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670878 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-serving-cert\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670898 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670921 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn89\" (UniqueName: \"kubernetes.io/projected/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-kube-api-access-hsn89\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670939 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6bdr\" (UniqueName: \"kubernetes.io/projected/37c7afce-94aa-439e-9415-cdf79931a95e-kube-api-access-n6bdr\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670957 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-cabundle\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670971 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0add1e2e-7e5a-4e56-8d51-85d089b4573c-config\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.670988 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzhxh\" (UniqueName: \"kubernetes.io/projected/028c8291-1fc5-4035-b46f-3c618ecf154d-kube-api-access-rzhxh\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671009 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrn7\" (UniqueName: \"kubernetes.io/projected/ba88c07b-f3ab-4abd-ad64-72d34148bc09-kube-api-access-9vrn7\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671024 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e0806294-1664-4633-ba20-7b687a8cf4b2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671041 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ff11bd0d-41c3-45a6-a955-259a754887e3-srv-cert\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: 
\"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671057 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/806c8633-7969-405c-b445-734ae20ede22-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tgwdx\" (UID: \"806c8633-7969-405c-b445-734ae20ede22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671075 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcnlc\" (UniqueName: \"kubernetes.io/projected/061606b3-9a47-4cff-ad31-04e9a5a05528-kube-api-access-fcnlc\") pod \"control-plane-machine-set-operator-78cbb6b69f-cx7jx\" (UID: \"061606b3-9a47-4cff-ad31-04e9a5a05528\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.671099 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.171081627 +0000 UTC m=+147.422308400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671135 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-node-bootstrap-token\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671168 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0cbe25c-56a8-4824-ace7-fff562288389-config-volume\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671199 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zvp\" (UniqueName: \"kubernetes.io/projected/748b551d-be9d-4862-9a91-12ca4ccc71fa-kube-api-access-z9zvp\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671224 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-audit-dir\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671246 4933 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c657682e-039a-46ee-b4c7-a5f95ee75e44-trusted-ca\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671268 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-registration-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671295 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-key\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671340 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671366 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/748b551d-be9d-4862-9a91-12ca4ccc71fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671391 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-config\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671414 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-plugins-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671438 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331221b7-75d0-458f-84d2-098fa5170277-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671463 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jcxs\" (UniqueName: \"kubernetes.io/projected/e75ec078-9907-4485-a7a4-991622b1788d-kube-api-access-2jcxs\") pod \"migrator-59844c95c7-xfcv2\" (UID: \"e75ec078-9907-4485-a7a4-991622b1788d\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671486 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-service-ca\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671512 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/78a11c22-fa5a-4573-ac7b-44bad6178356-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671549 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knnx\" (UniqueName: \"kubernetes.io/projected/f863f677-8452-4742-bb8f-1307b893c75a-kube-api-access-7knnx\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671583 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0cbe25c-56a8-4824-ace7-fff562288389-metrics-tls\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671607 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-srv-cert\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671630 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrd7k\" (UniqueName: \"kubernetes.io/projected/c657682e-039a-46ee-b4c7-a5f95ee75e44-kube-api-access-hrd7k\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671652 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0add1e2e-7e5a-4e56-8d51-85d089b4573c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671672 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671673 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671721 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c657682e-039a-46ee-b4c7-a5f95ee75e44-metrics-tls\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671745 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-client\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671762 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxw7\" (UniqueName: \"kubernetes.io/projected/d0cbe25c-56a8-4824-ace7-fff562288389-kube-api-access-nbxw7\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/028c8291-1fc5-4035-b46f-3c618ecf154d-images\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671811 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b25c4-e262-410b-aec8-e18fdb93d0c7-service-ca-bundle\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671848 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671869 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97b17284-20cf-4265-a2e0-721fe06f8105-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671885 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b17284-20cf-4265-a2e0-721fe06f8105-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671914 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-certs\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671932 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-etcd-client\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671946 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/748b551d-be9d-4862-9a91-12ca4ccc71fa-tmpfs\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671965 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t26cf\" (UniqueName: \"kubernetes.io/projected/043b25c4-e262-410b-aec8-e18fdb93d0c7-kube-api-access-t26cf\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.671984 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4f2h\" (UniqueName: \"kubernetes.io/projected/97b17284-20cf-4265-a2e0-721fe06f8105-kube-api-access-n4f2h\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672004 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b850db5e-74db-4823-bbb3-132b75b17c30-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672024 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgfgv\" (UniqueName: \"kubernetes.io/projected/ff11bd0d-41c3-45a6-a955-259a754887e3-kube-api-access-cgfgv\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672040 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbpt5\" (UniqueName: \"kubernetes.io/projected/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-kube-api-access-lbpt5\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672057 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba88c07b-f3ab-4abd-ad64-72d34148bc09-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672066 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-plugins-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672099 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/78a11c22-fa5a-4573-ac7b-44bad6178356-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672075 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ff11bd0d-41c3-45a6-a955-259a754887e3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672158 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331221b7-75d0-458f-84d2-098fa5170277-config\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672183 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/028c8291-1fc5-4035-b46f-3c618ecf154d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672201 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-audit-policies\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672239 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-audit-dir\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672246 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/061606b3-9a47-4cff-ad31-04e9a5a05528-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cx7jx\" (UID: \"061606b3-9a47-4cff-ad31-04e9a5a05528\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672265 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a11c22-fa5a-4573-ac7b-44bad6178356-serving-cert\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672288 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fjl\" (UniqueName: \"kubernetes.io/projected/78a11c22-fa5a-4573-ac7b-44bad6178356-kube-api-access-22fjl\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672319 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-csi-data-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672344 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqwr\" (UniqueName: \"kubernetes.io/projected/89e4dcb7-4b26-46b9-9346-61d13b21285a-kube-api-access-6bqwr\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672360 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-default-certificate\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-stats-auth\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672393 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-metrics-certs\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672409 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11c9d818-4012-423f-b3ce-bec8ac30f1d7-secret-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672425 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-socket-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672441 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk72g\" (UniqueName: \"kubernetes.io/projected/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-kube-api-access-qk72g\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672457 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/028c8291-1fc5-4035-b46f-3c618ecf154d-proxy-tls\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672473 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-config\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672490 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-serving-cert\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672502 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/748b551d-be9d-4862-9a91-12ca4ccc71fa-tmpfs\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672507 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-ca\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672540 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b850db5e-74db-4823-bbb3-132b75b17c30-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672557 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/331221b7-75d0-458f-84d2-098fa5170277-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672573 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z52dn\" (UniqueName: \"kubernetes.io/projected/806c8633-7969-405c-b445-734ae20ede22-kube-api-access-z52dn\") pod \"multus-admission-controller-857f4d67dd-tgwdx\" (UID: \"806c8633-7969-405c-b445-734ae20ede22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672590 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c657682e-039a-46ee-b4c7-a5f95ee75e44-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672609 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0806294-1664-4633-ba20-7b687a8cf4b2-proxy-tls\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672692 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672867 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/028c8291-1fc5-4035-b46f-3c618ecf154d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672977 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b17284-20cf-4265-a2e0-721fe06f8105-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.673017 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-audit-policies\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.673055 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-socket-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " 
pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.673662 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e0806294-1664-4633-ba20-7b687a8cf4b2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.674203 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-csi-data-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.674591 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba88c07b-f3ab-4abd-ad64-72d34148bc09-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.672362 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-registration-dir\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.675103 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-encryption-config\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.676135 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-serving-cert\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.677300 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-etcd-client\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.677880 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97b17284-20cf-4265-a2e0-721fe06f8105-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.677919 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ba88c07b-f3ab-4abd-ad64-72d34148bc09-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.677945 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a11c22-fa5a-4573-ac7b-44bad6178356-serving-cert\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.678098 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0806294-1664-4633-ba20-7b687a8cf4b2-proxy-tls\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.678841 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbd45ad7-14dd-46e8-941b-4d2bfced7567-metrics-tls\") pod \"dns-operator-744455d44c-jgdkm\" (UID: \"cbd45ad7-14dd-46e8-941b-4d2bfced7567\") " pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.679406 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-default-certificate\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.690305 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.697180 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-metrics-certs\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.710881 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.730021 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.750988 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.755044 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b25c4-e262-410b-aec8-e18fdb93d0c7-service-ca-bundle\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.770505 4933 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.773592 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.773807 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.273775871 +0000 UTC m=+147.525002634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.774876 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.775400 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.275378501 +0000 UTC m=+147.526605204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.781995 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/043b25c4-e262-410b-aec8-e18fdb93d0c7-stats-auth\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.790166 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.809994 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.814194 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/028c8291-1fc5-4035-b46f-3c618ecf154d-images\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.829630 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.850623 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.856350 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/028c8291-1fc5-4035-b46f-3c618ecf154d-proxy-tls\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.870167 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.875517 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.875800 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-serving-cert\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.876334 4933 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.37630198 +0000 UTC m=+147.627528723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.890477 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.910254 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.916559 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-client\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.930915 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.932666 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-config\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.950519 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.954672 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-ca\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.971532 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.975564 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-etcd-service-ca\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.979150 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: 
\"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:43 crc kubenswrapper[4933]: E1202 15:54:43.979710 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.479686335 +0000 UTC m=+147.730913068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:43 crc kubenswrapper[4933]: I1202 15:54:43.991344 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.010635 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.030868 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.049426 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.058310 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/061606b3-9a47-4cff-ad31-04e9a5a05528-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cx7jx\" (UID: \"061606b3-9a47-4cff-ad31-04e9a5a05528\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.070209 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.082171 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.082470 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.582428331 +0000 UTC m=+147.833655234 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.083049 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.083467 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.583448113 +0000 UTC m=+147.834674836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.091108 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.096273 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ff11bd0d-41c3-45a6-a955-259a754887e3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.096524 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-profile-collector-cert\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.096891 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11c9d818-4012-423f-b3ce-bec8ac30f1d7-secret-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.110848 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.131296 4933 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.137575 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ff11bd0d-41c3-45a6-a955-259a754887e3-srv-cert\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.150319 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.169625 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.174990 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-serving-cert\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.183919 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.184087 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.684059241 +0000 UTC m=+147.935285944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.184240 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.184285 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.184574 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.184631 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.184804 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.684778484 +0000 UTC m=+147.936005277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.184921 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.185736 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.188051 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.188245 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.189058 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.190076 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.210481 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.230791 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.234634 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-config\") pod \"service-ca-operator-777779d784-pvt57\" (UID: 
\"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.249489 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.270538 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.277319 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/806c8633-7969-405c-b445-734ae20ede22-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tgwdx\" (UID: \"806c8633-7969-405c-b445-734ae20ede22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.286436 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.286636 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.786609241 +0000 UTC m=+148.037835944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.286984 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.287322 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.787314333 +0000 UTC m=+148.038541036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.292072 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.309307 4933 request.go:700] Waited for 1.010941347s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.311033 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.316352 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-srv-cert\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.330534 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.350991 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.370648 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.376131 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b850db5e-74db-4823-bbb3-132b75b17c30-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.385398 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.387663 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.387778 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 15:54:44.887757046 +0000 UTC m=+148.138983749 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.388337 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.388739 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.888725566 +0000 UTC m=+148.139952269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.390530 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.398079 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b850db5e-74db-4823-bbb3-132b75b17c30-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.409285 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.411235 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.411356 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.431223 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.450363 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.459115 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.476566 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.483613 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.489431 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.490177 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:44.990157241 +0000 UTC m=+148.241383944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.491815 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.511592 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.516003 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/748b551d-be9d-4862-9a91-12ca4ccc71fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.524450 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/748b551d-be9d-4862-9a91-12ca4ccc71fa-webhook-cert\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.530891 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.535726 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/da774c74-89be-4c8c-b2e2-0bad771112d2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4wd5d\" (UID: \"da774c74-89be-4c8c-b2e2-0bad771112d2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.550543 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.569997 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.589755 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.591133 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.591475 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.091457581 +0000 UTC m=+148.342684284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.616517 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.624482 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c657682e-039a-46ee-b4c7-a5f95ee75e44-trusted-ca\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.629929 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.636864 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c657682e-039a-46ee-b4c7-a5f95ee75e44-metrics-tls\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.650115 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.670156 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.670957 4933 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.671070 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37c7afce-94aa-439e-9415-cdf79931a95e-cert podName:37c7afce-94aa-439e-9415-cdf79931a95e nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.171042869 +0000 UTC m=+148.422269572 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37c7afce-94aa-439e-9415-cdf79931a95e-cert") pod "ingress-canary-v2svg" (UID: "37c7afce-94aa-439e-9415-cdf79931a95e") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.671990 4933 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672012 4933 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672035 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0cbe25c-56a8-4824-ace7-fff562288389-metrics-tls podName:d0cbe25c-56a8-4824-ace7-fff562288389 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.1720239 +0000 UTC m=+148.423250603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0cbe25c-56a8-4824-ace7-fff562288389-metrics-tls") pod "dns-default-g4vq2" (UID: "d0cbe25c-56a8-4824-ace7-fff562288389") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672055 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-key podName:89e4dcb7-4b26-46b9-9346-61d13b21285a nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172044841 +0000 UTC m=+148.423271544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-key") pod "service-ca-9c57cc56f-ws24r" (UID: "89e4dcb7-4b26-46b9-9346-61d13b21285a") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672058 4933 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672074 4933 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672098 4933 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672106 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0add1e2e-7e5a-4e56-8d51-85d089b4573c-serving-cert podName:0add1e2e-7e5a-4e56-8d51-85d089b4573c nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172098053 +0000 UTC m=+148.423324756 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0add1e2e-7e5a-4e56-8d51-85d089b4573c-serving-cert") pod "kube-controller-manager-operator-78b949d7b-6qhmc" (UID: "0add1e2e-7e5a-4e56-8d51-85d089b4573c") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672121 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume podName:11c9d818-4012-423f-b3ce-bec8ac30f1d7 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172115343 +0000 UTC m=+148.423342046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume") pod "collect-profiles-29411505-4m9wv" (UID: "11c9d818-4012-423f-b3ce-bec8ac30f1d7") : failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672134 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/331221b7-75d0-458f-84d2-098fa5170277-serving-cert podName:331221b7-75d0-458f-84d2-098fa5170277 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172127933 +0000 UTC m=+148.423354636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/331221b7-75d0-458f-84d2-098fa5170277-serving-cert") pod "kube-apiserver-operator-766d6c64bb-xg27x" (UID: "331221b7-75d0-458f-84d2-098fa5170277") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672182 4933 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672216 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-cabundle podName:89e4dcb7-4b26-46b9-9346-61d13b21285a nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172206776 +0000 UTC m=+148.423433479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-cabundle") pod "service-ca-9c57cc56f-ws24r" (UID: "89e4dcb7-4b26-46b9-9346-61d13b21285a") : failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672542 4933 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672619 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0cbe25c-56a8-4824-ace7-fff562288389-config-volume podName:d0cbe25c-56a8-4824-ace7-fff562288389 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172608399 +0000 UTC m=+148.423835102 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/d0cbe25c-56a8-4824-ace7-fff562288389-config-volume") pod "dns-default-g4vq2" (UID: "d0cbe25c-56a8-4824-ace7-fff562288389") : failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672687 4933 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.672750 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0add1e2e-7e5a-4e56-8d51-85d089b4573c-config\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.672795 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-node-bootstrap-token podName:f863f677-8452-4742-bb8f-1307b893c75a nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.172722172 +0000 UTC m=+148.423948875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-node-bootstrap-token") pod "machine-config-server-57cjj" (UID: "f863f677-8452-4742-bb8f-1307b893c75a") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.673035 4933 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.673116 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/331221b7-75d0-458f-84d2-098fa5170277-config podName:331221b7-75d0-458f-84d2-098fa5170277 nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.173098224 +0000 UTC m=+148.424324927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/331221b7-75d0-458f-84d2-098fa5170277-config") pod "kube-apiserver-operator-766d6c64bb-xg27x" (UID: "331221b7-75d0-458f-84d2-098fa5170277") : failed to sync configmap cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.674202 4933 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.674259 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-certs podName:f863f677-8452-4742-bb8f-1307b893c75a nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.17424636 +0000 UTC m=+148.425473133 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-certs") pod "machine-config-server-57cjj" (UID: "f863f677-8452-4742-bb8f-1307b893c75a") : failed to sync secret cache: timed out waiting for the condition Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.692783 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.693641 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.193621538 +0000 UTC m=+148.444848251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.701303 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.710951 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.730792 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.750419 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.770344 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.791038 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.794456 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.794935 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.294924728 +0000 UTC m=+148.546151421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.809621 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.830312 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 15:54:44 crc kubenswrapper[4933]: W1202 15:54:44.837091 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-8aa26f4676cc7d4e564c7eab8f3585881117d3e9d49b03e40ce06e689bbd1c9a WatchSource:0}: Error finding container 8aa26f4676cc7d4e564c7eab8f3585881117d3e9d49b03e40ce06e689bbd1c9a: Status 404 returned error can't find the container with id 8aa26f4676cc7d4e564c7eab8f3585881117d3e9d49b03e40ce06e689bbd1c9a Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.853537 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.870446 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.890875 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.895077 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.895189 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.395170996 +0000 UTC m=+148.646397699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.895340 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.895667 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.395658071 +0000 UTC m=+148.646884774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.909759 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.930533 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.950560 4933 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.970928 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.990869 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.996967 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.997131 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.497110096 +0000 UTC m=+148.748336799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:44 crc kubenswrapper[4933]: I1202 15:54:44.997404 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:44 crc kubenswrapper[4933]: E1202 15:54:44.997720 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.497713005 +0000 UTC m=+148.748939708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.024329 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22c6a35b-d256-4c38-861a-f98b7a22d8fa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.046272 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhdx\" (UniqueName: \"kubernetes.io/projected/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-kube-api-access-6bhdx\") pod \"route-controller-manager-6576b87f9c-tdzpv\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.066055 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt4wv\" (UniqueName: \"kubernetes.io/projected/97c46bec-8e96-4d62-8808-549f712a9802-kube-api-access-mt4wv\") pod \"oauth-openshift-558db77b4-26gvw\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.085664 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrrn\" (UniqueName: \"kubernetes.io/projected/aae2ee36-4dcf-429c-985d-c891c771cb0d-kube-api-access-bnrrn\") pod \"openshift-apiserver-operator-796bbdcf4f-2hj84\" (UID: \"aae2ee36-4dcf-429c-985d-c891c771cb0d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.098763 4933 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.098978 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.598949943 +0000 UTC m=+148.850176646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.099138 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.099465 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.599451169 +0000 UTC m=+148.850677872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.104097 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pznp\" (UniqueName: \"kubernetes.io/projected/ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84-kube-api-access-6pznp\") pod \"cluster-samples-operator-665b6dd947-774lf\" (UID: \"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.114158 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.124694 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.126545 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk87g\" (UniqueName: \"kubernetes.io/projected/9be27357-dce3-4ea8-8f1f-6bb17fc0870d-kube-api-access-qk87g\") pod \"machine-approver-56656f9798-hjncd\" (UID: \"9be27357-dce3-4ea8-8f1f-6bb17fc0870d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.127744 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a3c50e0fbc049d8c9875e7fb9a9f83e13cf15709e0b0c6abe544f0eb1ffd7475"} Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.128315 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.133866 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8aa26f4676cc7d4e564c7eab8f3585881117d3e9d49b03e40ce06e689bbd1c9a"} Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.135096 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"43be0f194061e40fdf69e8efd1521d72ed63902e9b6f53ca41550dd7adf3814b"} Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.144175 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrzt\" (UniqueName: \"kubernetes.io/projected/2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4-kube-api-access-zzrzt\") pod \"machine-api-operator-5694c8668f-brhm4\" (UID: \"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.155765 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.165357 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wvvb\" (UniqueName: \"kubernetes.io/projected/14eded2c-58d8-4348-a7ae-1a027349ae71-kube-api-access-5wvvb\") pod \"apiserver-76f77b778f-w8l5x\" (UID: \"14eded2c-58d8-4348-a7ae-1a027349ae71\") " pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.184215 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxt5\" (UniqueName: \"kubernetes.io/projected/22c6a35b-d256-4c38-861a-f98b7a22d8fa-kube-api-access-xtxt5\") pod \"cluster-image-registry-operator-dc59b4c8b-8pvz9\" (UID: \"22c6a35b-d256-4c38-861a-f98b7a22d8fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.189801 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.200586 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.200851 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.200983 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.700955575 +0000 UTC m=+148.952182278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201131 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-node-bootstrap-token\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201163 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0cbe25c-56a8-4824-ace7-fff562288389-config-volume\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201194 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-key\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201227 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331221b7-75d0-458f-84d2-098fa5170277-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201258 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0cbe25c-56a8-4824-ace7-fff562288389-metrics-tls\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201284 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0add1e2e-7e5a-4e56-8d51-85d089b4573c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201305 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201358 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-certs\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201411 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331221b7-75d0-458f-84d2-098fa5170277-config\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201492 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37c7afce-94aa-439e-9415-cdf79931a95e-cert\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201534 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.201605 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-cabundle\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.201896 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.701881574 +0000 UTC m=+148.953108277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.202262 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0cbe25c-56a8-4824-ace7-fff562288389-config-volume\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.202349 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.202803 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-cabundle\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.203012 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331221b7-75d0-458f-84d2-098fa5170277-config\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.204210 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0add1e2e-7e5a-4e56-8d51-85d089b4573c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.206659 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89e4dcb7-4b26-46b9-9346-61d13b21285a-signing-key\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.206801 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331221b7-75d0-458f-84d2-098fa5170277-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.209992 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.230414 4933 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.249858 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.255664 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0cbe25c-56a8-4824-ace7-fff562288389-metrics-tls\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.270507 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.290250 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.302385 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.302967 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.303121 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.803091852 +0000 UTC m=+149.054318595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.303349 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.303882 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.803866376 +0000 UTC m=+149.055093109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.310463 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.316071 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37c7afce-94aa-439e-9415-cdf79931a95e-cert\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.328968 4933 request.go:700] Waited for 1.943324148s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0 Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.331152 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.335310 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-certs\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.351591 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.370898 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.375914 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f863f677-8452-4742-bb8f-1307b893c75a-node-bootstrap-token\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.382942 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.414100 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.414754 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.415394 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:45.915374987 +0000 UTC m=+149.166601690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.440262 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ppt4\" (UniqueName: \"kubernetes.io/projected/071e51e3-11fa-4ff1-9417-b7fbca815e88-kube-api-access-9ppt4\") pod \"authentication-operator-69f744f599-kmxmh\" (UID: \"071e51e3-11fa-4ff1-9417-b7fbca815e88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.448773 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-bound-sa-token\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.465363 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9c5\" (UniqueName: \"kubernetes.io/projected/50f1708b-32f7-42c3-a3ec-57f654624efa-kube-api-access-4c9c5\") pod \"controller-manager-879f6c89f-5ttq4\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.483504 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd48q\" (UniqueName: \"kubernetes.io/projected/fcc5c745-13b8-4ff8-b677-095dd8a46081-kube-api-access-rd48q\") pod \"console-f9d7485db-lcwgg\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.505169 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkt4\" (UniqueName: \"kubernetes.io/projected/8b382362-3187-4571-a8a0-057cbccc89ff-kube-api-access-2zkt4\") pod \"downloads-7954f5f757-9r8tr\" (UID: \"8b382362-3187-4571-a8a0-057cbccc89ff\") " pod="openshift-console/downloads-7954f5f757-9r8tr" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.516657 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.516971 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.016960106 +0000 UTC m=+149.268186809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.530965 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7qx\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-kube-api-access-jt7qx\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.552870 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrtr5\" (UniqueName: \"kubernetes.io/projected/4762fc2a-f608-4992-a53f-72ba66df1820-kube-api-access-wrtr5\") pod \"console-operator-58897d9998-6dp2z\" (UID: \"4762fc2a-f608-4992-a53f-72ba66df1820\") " pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.568643 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5fx\" (UniqueName: \"kubernetes.io/projected/da774c74-89be-4c8c-b2e2-0bad771112d2-kube-api-access-vj5fx\") pod \"package-server-manager-789f6589d5-4wd5d\" (UID: \"da774c74-89be-4c8c-b2e2-0bad771112d2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.576262 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.587603 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfchz\" (UniqueName: \"kubernetes.io/projected/11c9d818-4012-423f-b3ce-bec8ac30f1d7-kube-api-access-nfchz\") pod \"collect-profiles-29411505-4m9wv\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.606756 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw6kr\" (UniqueName: \"kubernetes.io/projected/146d5bc3-8ff4-4175-9482-c9e3f220e5c0-kube-api-access-bw6kr\") pod \"apiserver-7bbb656c7d-grlhq\" (UID: \"146d5bc3-8ff4-4175-9482-c9e3f220e5c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.618211 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.618462 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.118420391 +0000 UTC m=+149.369647134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.618666 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.618881 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.619365 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.11934482 +0000 UTC m=+149.370571553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.627179 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcnlc\" (UniqueName: \"kubernetes.io/projected/061606b3-9a47-4cff-ad31-04e9a5a05528-kube-api-access-fcnlc\") pod \"control-plane-machine-set-operator-78cbb6b69f-cx7jx\" (UID: \"061606b3-9a47-4cff-ad31-04e9a5a05528\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.645644 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jcxs\" (UniqueName: \"kubernetes.io/projected/e75ec078-9907-4485-a7a4-991622b1788d-kube-api-access-2jcxs\") pod \"migrator-59844c95c7-xfcv2\" (UID: \"e75ec078-9907-4485-a7a4-991622b1788d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.654492 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.674004 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.677314 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zvp\" (UniqueName: \"kubernetes.io/projected/748b551d-be9d-4862-9a91-12ca4ccc71fa-kube-api-access-z9zvp\") pod \"packageserver-d55dfcdfc-d99zb\" (UID: \"748b551d-be9d-4862-9a91-12ca4ccc71fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.691164 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0add1e2e-7e5a-4e56-8d51-85d089b4573c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6qhmc\" (UID: \"0add1e2e-7e5a-4e56-8d51-85d089b4573c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.695992 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9r8tr" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.708176 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6s8\" (UniqueName: \"kubernetes.io/projected/6a612bec-8f7d-4fc1-bba2-877fc67b13f9-kube-api-access-ct6s8\") pod \"service-ca-operator-777779d784-pvt57\" (UID: \"6a612bec-8f7d-4fc1-bba2-877fc67b13f9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.709195 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.716337 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.721478 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.721692 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.221674182 +0000 UTC m=+149.472900885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.721775 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.722389 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.222382455 +0000 UTC m=+149.473609158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.726973 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrd7k\" (UniqueName: \"kubernetes.io/projected/c657682e-039a-46ee-b4c7-a5f95ee75e44-kube-api-access-hrd7k\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.739127 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.749370 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbsnr\" (UniqueName: \"kubernetes.io/projected/cbd45ad7-14dd-46e8-941b-4d2bfced7567-kube-api-access-wbsnr\") pod \"dns-operator-744455d44c-jgdkm\" (UID: \"cbd45ad7-14dd-46e8-941b-4d2bfced7567\") " pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.766415 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzhxh\" (UniqueName: \"kubernetes.io/projected/028c8291-1fc5-4035-b46f-3c618ecf154d-kube-api-access-rzhxh\") pod \"machine-config-operator-74547568cd-2r9nt\" (UID: \"028c8291-1fc5-4035-b46f-3c618ecf154d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.787775 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qxk\" (UniqueName: \"kubernetes.io/projected/cb41f368-0638-40fd-be0f-bc71d0182af8-kube-api-access-98qxk\") pod \"marketplace-operator-79b997595-r7dnz\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.815391 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.823667 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.824132 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.324110528 +0000 UTC m=+149.575337231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.829361 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.830254 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn89\" (UniqueName: \"kubernetes.io/projected/3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f-kube-api-access-hsn89\") pod \"catalog-operator-68c6474976-ckqs2\" (UID: \"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.832415 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd4hf\" (UniqueName: \"kubernetes.io/projected/e0806294-1664-4633-ba20-7b687a8cf4b2-kube-api-access-rd4hf\") pod \"machine-config-controller-84d6567774-6f2dw\" (UID: \"e0806294-1664-4633-ba20-7b687a8cf4b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.836335 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.844668 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.859517 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6bdr\" (UniqueName: \"kubernetes.io/projected/37c7afce-94aa-439e-9415-cdf79931a95e-kube-api-access-n6bdr\") pod \"ingress-canary-v2svg\" (UID: \"37c7afce-94aa-439e-9415-cdf79931a95e\") " pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.865372 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.867619 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knnx\" (UniqueName: \"kubernetes.io/projected/f863f677-8452-4742-bb8f-1307b893c75a-kube-api-access-7knnx\") pod \"machine-config-server-57cjj\" (UID: \"f863f677-8452-4742-bb8f-1307b893c75a\") " pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.886497 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.894923 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxw7\" (UniqueName: \"kubernetes.io/projected/d0cbe25c-56a8-4824-ace7-fff562288389-kube-api-access-nbxw7\") pod \"dns-default-g4vq2\" (UID: \"d0cbe25c-56a8-4824-ace7-fff562288389\") " pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.895251 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.906409 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk72g\" (UniqueName: \"kubernetes.io/projected/b7c9b4f1-183e-4d60-bb84-b2995d89b0fe-kube-api-access-qk72g\") pod \"etcd-operator-b45778765-d86pn\" (UID: \"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.919472 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.929199 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:45 crc kubenswrapper[4933]: E1202 15:54:45.929634 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.429616541 +0000 UTC m=+149.680843244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.931037 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbpt5\" (UniqueName: \"kubernetes.io/projected/9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab-kube-api-access-lbpt5\") pod \"csi-hostpathplugin-5xvkq\" (UID: \"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab\") " pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.936454 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.946595 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.959595 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t26cf\" (UniqueName: \"kubernetes.io/projected/043b25c4-e262-410b-aec8-e18fdb93d0c7-kube-api-access-t26cf\") pod \"router-default-5444994796-dgdm5\" (UID: \"043b25c4-e262-410b-aec8-e18fdb93d0c7\") " pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:45 crc kubenswrapper[4933]: I1202 15:54:45.990494 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4f2h\" (UniqueName: \"kubernetes.io/projected/97b17284-20cf-4265-a2e0-721fe06f8105-kube-api-access-n4f2h\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhcjs\" (UID: \"97b17284-20cf-4265-a2e0-721fe06f8105\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.015864 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgfgv\" (UniqueName: \"kubernetes.io/projected/ff11bd0d-41c3-45a6-a955-259a754887e3-kube-api-access-cgfgv\") pod \"olm-operator-6b444d44fb-hdrfm\" (UID: \"ff11bd0d-41c3-45a6-a955-259a754887e3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.019376 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b850db5e-74db-4823-bbb3-132b75b17c30-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b986z\" (UID: \"b850db5e-74db-4823-bbb3-132b75b17c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.027679 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqwr\" (UniqueName: \"kubernetes.io/projected/89e4dcb7-4b26-46b9-9346-61d13b21285a-kube-api-access-6bqwr\") pod \"service-ca-9c57cc56f-ws24r\" (UID: \"89e4dcb7-4b26-46b9-9346-61d13b21285a\") " pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.030348 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.030975 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.530814136 +0000 UTC m=+149.782040839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.031685 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.045537 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fjl\" (UniqueName: \"kubernetes.io/projected/78a11c22-fa5a-4573-ac7b-44bad6178356-kube-api-access-22fjl\") pod \"openshift-config-operator-7777fb866f-b2sss\" (UID: \"78a11c22-fa5a-4573-ac7b-44bad6178356\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.056282 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.065615 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g4vq2" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.073037 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrn7\" (UniqueName: \"kubernetes.io/projected/ba88c07b-f3ab-4abd-ad64-72d34148bc09-kube-api-access-9vrn7\") pod \"kube-storage-version-migrator-operator-b67b599dd-htb5t\" (UID: \"ba88c07b-f3ab-4abd-ad64-72d34148bc09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.079676 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2svg" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.092047 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/331221b7-75d0-458f-84d2-098fa5170277-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xg27x\" (UID: \"331221b7-75d0-458f-84d2-098fa5170277\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.097731 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-57cjj" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.112801 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.113415 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z52dn\" (UniqueName: \"kubernetes.io/projected/806c8633-7969-405c-b445-734ae20ede22-kube-api-access-z52dn\") pod \"multus-admission-controller-857f4d67dd-tgwdx\" (UID: \"806c8633-7969-405c-b445-734ae20ede22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.121519 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.132094 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.132504 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.632492119 +0000 UTC m=+149.883718822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.134720 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c657682e-039a-46ee-b4c7-a5f95ee75e44-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v4g7n\" (UID: \"c657682e-039a-46ee-b4c7-a5f95ee75e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.151105 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.157891 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:46 crc kubenswrapper[4933]: W1202 15:54:46.165685 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf863f677_8452_4742_bb8f_1307b893c75a.slice/crio-7af6cf7843126a9dd2103900c94670504d4e65f7729acb443c9eccab433e2f8c WatchSource:0}: Error finding container 7af6cf7843126a9dd2103900c94670504d4e65f7729acb443c9eccab433e2f8c: Status 404 returned error can't find the container with id 7af6cf7843126a9dd2103900c94670504d4e65f7729acb443c9eccab433e2f8c Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.176185 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.179022 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f3d61e74446ebebbaf2409a588ba586bd8047e6764d4f35af68d427171d149a3"} Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.179092 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.182168 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.186876 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf"] Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.190111 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" event={"ID":"9be27357-dce3-4ea8-8f1f-6bb17fc0870d","Type":"ContainerStarted","Data":"7139521f44c1f0baa91f31ad28c057bccaf04995dbc9c9be12ef8843cb9e3881"} Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.203338 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.207740 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9"] Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.213110 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"77ef7edefc40968ec95cf5fb410bc8f5b52d5114e1d6368f58591e7a831324e4"} Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.216977 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"24ef78b073291fbdf1b97681f82ebea47d0076cdc3324eb454e553f0f4580db0"} Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.226155 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.241808 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.242286 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.742270795 +0000 UTC m=+149.993497498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.270426 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.322228 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.345676 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.347412 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.847398195 +0000 UTC m=+150.098624898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.446959 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.447264 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.947228709 +0000 UTC m=+150.198455412 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.447359 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.447798 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:46.947789527 +0000 UTC m=+150.199016230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.553427 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.554240 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.054218268 +0000 UTC m=+150.305444971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.655185 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.655523 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.155508268 +0000 UTC m=+150.406734971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.757177 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.760599 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.260578036 +0000 UTC m=+150.511804739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.861632 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.862008 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.36199659 +0000 UTC m=+150.613223293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.966123 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.966583 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.466566773 +0000 UTC m=+150.717793476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:46 crc kubenswrapper[4933]: I1202 15:54:46.967089 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:46 crc kubenswrapper[4933]: E1202 15:54:46.968228 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.468203675 +0000 UTC m=+150.719430378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.068786 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.069132 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.569118653 +0000 UTC m=+150.820345356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.170351 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.170773 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.170867 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.170810 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.670777644 +0000 UTC m=+150.922004347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.233370 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" event={"ID":"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84","Type":"ContainerStarted","Data":"0b9f65c169c304d0efde48b9868e3787616349a019a0f1fe6b57252cc07186b6"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.238670 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dgdm5" event={"ID":"043b25c4-e262-410b-aec8-e18fdb93d0c7","Type":"ContainerStarted","Data":"d0d554afaafb8cec1d7b042eb69eae341ac6d33b3b26d978c817077de79dcbd5"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.238727 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dgdm5" event={"ID":"043b25c4-e262-410b-aec8-e18fdb93d0c7","Type":"ContainerStarted","Data":"e4676aba2831d5240328c3f9b1a777f168231aa2aab9ace9f3ff9d6945023f81"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.240640 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" event={"ID":"9be27357-dce3-4ea8-8f1f-6bb17fc0870d","Type":"ContainerStarted","Data":"29ca95933bc71311f0159fbf926dd80a8e2b3ea7ad9090b0c655ca296487bf4e"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.240695 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" event={"ID":"9be27357-dce3-4ea8-8f1f-6bb17fc0870d","Type":"ContainerStarted","Data":"1127427609c9fdf0ce7aa03a9e3f1042a862a388f03d288ab67762b8d8dc34f0"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.242905 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-57cjj" event={"ID":"f863f677-8452-4742-bb8f-1307b893c75a","Type":"ContainerStarted","Data":"f08c40e5710d8296cb7c3e2314a0114a48cd2065ba2beb77178e3415ff66032f"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.242996 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-57cjj" event={"ID":"f863f677-8452-4742-bb8f-1307b893c75a","Type":"ContainerStarted","Data":"7af6cf7843126a9dd2103900c94670504d4e65f7729acb443c9eccab433e2f8c"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.245314 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" event={"ID":"22c6a35b-d256-4c38-861a-f98b7a22d8fa","Type":"ContainerStarted","Data":"266a5184d4dd47505fc217e4d3fa8b6fbdfaf74264f1e49a9372f268d768c237"} Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.245404 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" event={"ID":"22c6a35b-d256-4c38-861a-f98b7a22d8fa","Type":"ContainerStarted","Data":"57df8879f64c7e8b50259997aa3b971d4f7669a65fce8ff27dc20fd60b998220"} Dec 02 15:54:47 crc 
kubenswrapper[4933]: I1202 15:54:47.273085 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.281192 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.781157829 +0000 UTC m=+151.032384532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.357411 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lcwgg"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.361595 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.368002 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5ttq4"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.372809 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6dp2z"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.377647 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.380995 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.880970203 +0000 UTC m=+151.132196906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.385180 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.399084 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kmxmh"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.407684 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-26gvw"] Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.416703 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3e77e2_5cb3_44df_8570_e18f8e8f15a5.slice/crio-5170cca61e2eda9138aae42d2e43ec7eef1986db9ea6647bfe0e0d93c33f29f2 WatchSource:0}: Error finding container 5170cca61e2eda9138aae42d2e43ec7eef1986db9ea6647bfe0e0d93c33f29f2: Status 404 returned error can't find the container with id 5170cca61e2eda9138aae42d2e43ec7eef1986db9ea6647bfe0e0d93c33f29f2 Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.420540 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w8l5x"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.440205 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"] Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.449718 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda774c74_89be_4c8c_b2e2_0bad771112d2.slice/crio-67ff16e6edbab91dde2af8b2aa73c94003db671dddbdd6fc04bd0c91b8e2ff7a WatchSource:0}: Error finding container 67ff16e6edbab91dde2af8b2aa73c94003db671dddbdd6fc04bd0c91b8e2ff7a: Status 404 returned error can't find the container with id 67ff16e6edbab91dde2af8b2aa73c94003db671dddbdd6fc04bd0c91b8e2ff7a Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.450869 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.489839 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.490273 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:47.990252004 +0000 UTC m=+151.241478697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.499615 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw"] Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.501668 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcc5c745_13b8_4ff8_b677_095dd8a46081.slice/crio-b75e4a2db0750eb91a9ff384a0d37c436e19ae4a376b29bcf9d645755d20500c WatchSource:0}: Error finding container b75e4a2db0750eb91a9ff384a0d37c436e19ae4a376b29bcf9d645755d20500c: Status 404 returned error can't find the container with id b75e4a2db0750eb91a9ff384a0d37c436e19ae4a376b29bcf9d645755d20500c Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.531568 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx"] Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.549067 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14eded2c_58d8_4348_a7ae_1a027349ae71.slice/crio-5f052b3461e0e20c68acbc5d72d4de65dbf3205b8e4177f78ff2d7222c748447 WatchSource:0}: Error finding container 5f052b3461e0e20c68acbc5d72d4de65dbf3205b8e4177f78ff2d7222c748447: Status 404 returned error can't find the container with id 5f052b3461e0e20c68acbc5d72d4de65dbf3205b8e4177f78ff2d7222c748447 Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.591694 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.592266 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.092248766 +0000 UTC m=+151.343475469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.652332 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9r8tr"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.658437 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b2sss"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.695408 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.695730 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.195710044 +0000 UTC m=+151.446936747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.700520 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.711945 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.716250 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.725455 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.730357 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pvt57"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.744749 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.748835 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-brhm4"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.766434 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-g4vq2"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.796748 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.797208 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.29719619 +0000 UTC m=+151.548422883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.800041 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3dbac3_1faa_46d1_8a2a_ddf41a7f1f0f.slice/crio-72782eb588e56567ed41a98e9e474bd1d2deed68384d458bafbd3c2a59742082 WatchSource:0}: Error finding container 72782eb588e56567ed41a98e9e474bd1d2deed68384d458bafbd3c2a59742082: Status 404 returned error can't find the container with id 72782eb588e56567ed41a98e9e474bd1d2deed68384d458bafbd3c2a59742082 Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.808279 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae2ee36_4dcf_429c_985d_c891c771cb0d.slice/crio-6b1aaa4e9f33b5a65f92b6ed5b495b6c7ce3a34c02ebd866cc0bb7f9a520688c WatchSource:0}: Error finding container 6b1aaa4e9f33b5a65f92b6ed5b495b6c7ce3a34c02ebd866cc0bb7f9a520688c: Status 404 returned error can't find the container with id 6b1aaa4e9f33b5a65f92b6ed5b495b6c7ce3a34c02ebd866cc0bb7f9a520688c Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.820274 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a11c22_fa5a_4573_ac7b_44bad6178356.slice/crio-9d8a47d3368239e8b17f2ae4aca7a59b98e1fb0a2a3d2567b5fa4346d63ce632 WatchSource:0}: Error finding container 9d8a47d3368239e8b17f2ae4aca7a59b98e1fb0a2a3d2567b5fa4346d63ce632: Status 404 returned error can't find the container with id 9d8a47d3368239e8b17f2ae4aca7a59b98e1fb0a2a3d2567b5fa4346d63ce632 Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.821205 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-57cjj" podStartSLOduration=4.821186193 podStartE2EDuration="4.821186193s" podCreationTimestamp="2025-12-02 15:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:47.82077343 +0000 UTC m=+151.072000133" watchObservedRunningTime="2025-12-02 15:54:47.821186193 +0000 UTC m=+151.072412896" Dec 02 15:54:47 crc 
kubenswrapper[4933]: W1202 15:54:47.833798 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod028c8291_1fc5_4035_b46f_3c618ecf154d.slice/crio-dace550fbc2fee6edc407c788ac2cf3e3edec0642e706a73ff01e21793249a7c WatchSource:0}: Error finding container dace550fbc2fee6edc407c788ac2cf3e3edec0642e706a73ff01e21793249a7c: Status 404 returned error can't find the container with id dace550fbc2fee6edc407c788ac2cf3e3edec0642e706a73ff01e21793249a7c Dec 02 15:54:47 crc kubenswrapper[4933]: W1202 15:54:47.845071 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0ed0e9_47b5_44d2_89c5_9674e4dc3ed4.slice/crio-e5dbf15b9e92f870c3f33eeef4cd8cc0a24f467ab2acc33ea465e59802fda51f WatchSource:0}: Error finding container e5dbf15b9e92f870c3f33eeef4cd8cc0a24f467ab2acc33ea465e59802fda51f: Status 404 returned error can't find the container with id e5dbf15b9e92f870c3f33eeef4cd8cc0a24f467ab2acc33ea465e59802fda51f Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.895229 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.899400 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.899577 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.399542163 +0000 UTC m=+151.650768866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.899923 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:47 crc kubenswrapper[4933]: E1202 15:54:47.900405 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.400392219 +0000 UTC m=+151.651618922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.900785 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v2svg"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.921525 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jgdkm"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.958613 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ws24r"] Dec 02 15:54:47 crc kubenswrapper[4933]: I1202 15:54:47.975424 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.000679 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.001058 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.501037569 +0000 UTC m=+151.752264262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.014923 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5xvkq"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.023769 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dgdm5" podStartSLOduration=126.023753272 podStartE2EDuration="2m6.023753272s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:48.022554614 +0000 UTC m=+151.273781337" watchObservedRunningTime="2025-12-02 15:54:48.023753272 +0000 UTC m=+151.274979975" Dec 02 15:54:48 crc kubenswrapper[4933]: W1202 15:54:48.073967 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd45ad7_14dd_46e8_941b_4d2bfced7567.slice/crio-6517402996cfefc205602dc54da1f5239253835d8cd8f3be0828bb42ae678384 WatchSource:0}: Error finding container 6517402996cfefc205602dc54da1f5239253835d8cd8f3be0828bb42ae678384: Status 404 returned error can't find the container with id 6517402996cfefc205602dc54da1f5239253835d8cd8f3be0828bb42ae678384 Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.082439 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r7dnz"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.101353 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.101676 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.601660478 +0000 UTC m=+151.852887181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.109910 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8pvz9" podStartSLOduration=126.109879536 podStartE2EDuration="2m6.109879536s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:48.107619675 +0000 UTC m=+151.358846398" watchObservedRunningTime="2025-12-02 15:54:48.109879536 +0000 UTC m=+151.361106239" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.160947 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.188077 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:48 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:48 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:48 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.188151 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.202430 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.202643 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.702616677 +0000 UTC m=+151.953843380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.202789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.203156 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.703148204 +0000 UTC m=+151.954374907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.222432 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d86pn"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.265318 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.276748 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.303123 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.304035 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.304057 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z"] Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.305044 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.805023742 +0000 UTC m=+152.056250455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.320179 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" event={"ID":"0add1e2e-7e5a-4e56-8d51-85d089b4573c","Type":"ContainerStarted","Data":"74496ebde6029cad654e28dd3ed4fbb9a6d61aa5689ab3f8ee556234ef358b2d"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.331702 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.342936 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" event={"ID":"061606b3-9a47-4cff-ad31-04e9a5a05528","Type":"ContainerStarted","Data":"6217a1327022156b364e9988018beab4f80ae5907ec7e4db4891c71287878db4"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.348393 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" event={"ID":"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5","Type":"ContainerStarted","Data":"638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.348430 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" event={"ID":"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5","Type":"ContainerStarted","Data":"5170cca61e2eda9138aae42d2e43ec7eef1986db9ea6647bfe0e0d93c33f29f2"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.349809 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.351541 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" event={"ID":"6a612bec-8f7d-4fc1-bba2-877fc67b13f9","Type":"ContainerStarted","Data":"e31e99fe0e47ceb278b385d68114910f8d7c934e83364d74375f1a4349c93da4"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.352217 4933 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tdzpv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.352279 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" podUID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.352664 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" event={"ID":"50f1708b-32f7-42c3-a3ec-57f654624efa","Type":"ContainerStarted","Data":"31cfa6907247dc63d7723d030649af968668e8ca15715a542073c7c4713a26f5"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.354870 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" event={"ID":"748b551d-be9d-4862-9a91-12ca4ccc71fa","Type":"ContainerStarted","Data":"c089b6e9f4652c3de0f25dcb6789c037ad62ada4956a798388333334d86d943c"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.355567 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" event={"ID":"97b17284-20cf-4265-a2e0-721fe06f8105","Type":"ContainerStarted","Data":"fe2e3d7f043786a42d4b6bb0ebef78ee7c74d1485f4086bc2cc9ec0d748d2438"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.356125 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" event={"ID":"89e4dcb7-4b26-46b9-9346-61d13b21285a","Type":"ContainerStarted","Data":"d0e579a180d83913de213aea913c3560ad0460e424fffaa275e8c9e6cd53cbbb"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.356805 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" event={"ID":"e75ec078-9907-4485-a7a4-991622b1788d","Type":"ContainerStarted","Data":"6a23b03193887bd22938647857972f26f4c82a2249d47b1a1e71c02194a172d3"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.357557 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" event={"ID":"97c46bec-8e96-4d62-8808-549f712a9802","Type":"ContainerStarted","Data":"d6a6f452db963ef9d5e86189e27b24357c3586aaaa061626711d36f3333fc55f"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.370396 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hjncd" podStartSLOduration=126.370374134 podStartE2EDuration="2m6.370374134s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:48.338119291 +0000 UTC m=+151.589345994" watchObservedRunningTime="2025-12-02 15:54:48.370374134 +0000 UTC m=+151.621600837" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.371166 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tgwdx"] Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.377543 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lcwgg" event={"ID":"fcc5c745-13b8-4ff8-b677-095dd8a46081","Type":"ContainerStarted","Data":"aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.377598 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lcwgg" event={"ID":"fcc5c745-13b8-4ff8-b677-095dd8a46081","Type":"ContainerStarted","Data":"b75e4a2db0750eb91a9ff384a0d37c436e19ae4a376b29bcf9d645755d20500c"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.378761 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" 
event={"ID":"146d5bc3-8ff4-4175-9482-c9e3f220e5c0","Type":"ContainerStarted","Data":"8d314cc227173ac3c50223818363451ca9643025f0733ce5a4d56841b7f25340"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.382254 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" event={"ID":"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4","Type":"ContainerStarted","Data":"e5dbf15b9e92f870c3f33eeef4cd8cc0a24f467ab2acc33ea465e59802fda51f"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.389779 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" event={"ID":"aae2ee36-4dcf-429c-985d-c891c771cb0d","Type":"ContainerStarted","Data":"6b1aaa4e9f33b5a65f92b6ed5b495b6c7ce3a34c02ebd866cc0bb7f9a520688c"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.391702 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" event={"ID":"cbd45ad7-14dd-46e8-941b-4d2bfced7567","Type":"ContainerStarted","Data":"6517402996cfefc205602dc54da1f5239253835d8cd8f3be0828bb42ae678384"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.393112 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" event={"ID":"14eded2c-58d8-4348-a7ae-1a027349ae71","Type":"ContainerStarted","Data":"5f052b3461e0e20c68acbc5d72d4de65dbf3205b8e4177f78ff2d7222c748447"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.394150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" event={"ID":"028c8291-1fc5-4035-b46f-3c618ecf154d","Type":"ContainerStarted","Data":"dace550fbc2fee6edc407c788ac2cf3e3edec0642e706a73ff01e21793249a7c"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.395043 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" event={"ID":"e0806294-1664-4633-ba20-7b687a8cf4b2","Type":"ContainerStarted","Data":"970b08ef6ad55da7103a33f6d742dae24ae923ad80d6d9d922509583dbc068cc"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.395884 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" event={"ID":"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f","Type":"ContainerStarted","Data":"72782eb588e56567ed41a98e9e474bd1d2deed68384d458bafbd3c2a59742082"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.396961 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" event={"ID":"da774c74-89be-4c8c-b2e2-0bad771112d2","Type":"ContainerStarted","Data":"9478afc2aa6de4d721a6ae3b6c3cc4f6bda6ae4271e637b2beda15bfce37f45c"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.396991 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" event={"ID":"da774c74-89be-4c8c-b2e2-0bad771112d2","Type":"ContainerStarted","Data":"67ff16e6edbab91dde2af8b2aa73c94003db671dddbdd6fc04bd0c91b8e2ff7a"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.397798 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" 
event={"ID":"78a11c22-fa5a-4573-ac7b-44bad6178356","Type":"ContainerStarted","Data":"9d8a47d3368239e8b17f2ae4aca7a59b98e1fb0a2a3d2567b5fa4346d63ce632"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.398640 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" event={"ID":"11c9d818-4012-423f-b3ce-bec8ac30f1d7","Type":"ContainerStarted","Data":"a28e87bc132f50e73d8fb72cf28cbf990c33d45d9b1100d63afda83307f367a2"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.399452 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9r8tr" event={"ID":"8b382362-3187-4571-a8a0-057cbccc89ff","Type":"ContainerStarted","Data":"87d4da9ac144da1ec3718f43b3d47f0649e0d31d0a3000ecfdfee08410a1f826"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.400279 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g4vq2" event={"ID":"d0cbe25c-56a8-4824-ace7-fff562288389","Type":"ContainerStarted","Data":"6d21b8d2c6c1b5a3fca62f48f200523c4435968b0fa164e17ace4f9cfed76e2c"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.401566 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6dp2z" event={"ID":"4762fc2a-f608-4992-a53f-72ba66df1820","Type":"ContainerStarted","Data":"7ba4a9df04075865621be3b39c894e545c1916e903fdd419f97db3803db1f5f8"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.401616 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6dp2z" event={"ID":"4762fc2a-f608-4992-a53f-72ba66df1820","Type":"ContainerStarted","Data":"5b000182c2064b46f3553a5b9709a9e1612de114a5d2e1c7e6ecbaff53f349ba"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.401752 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6dp2z" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.404049 4933 patch_prober.go:28] interesting pod/console-operator-58897d9998-6dp2z container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.404131 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6dp2z" podUID="4762fc2a-f608-4992-a53f-72ba66df1820" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.405095 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.405425 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:48.905414534 +0000 UTC m=+152.156641237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.406717 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" event={"ID":"071e51e3-11fa-4ff1-9417-b7fbca815e88","Type":"ContainerStarted","Data":"907066820abc9e2b016265f3095edfb9ccd095da3e8524598ee47149e17a2466"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.406754 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" event={"ID":"071e51e3-11fa-4ff1-9417-b7fbca815e88","Type":"ContainerStarted","Data":"d036340551af05ce08c4b0151c6117fb9f71013d0ae1f87153555fca0a3fa6be"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.410393 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" event={"ID":"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84","Type":"ContainerStarted","Data":"b9ab8b349e81d01f334cee328d2c6c5c3d659f04d735744e3998e8cc9738e9ef"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.412254 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v2svg" event={"ID":"37c7afce-94aa-439e-9415-cdf79931a95e","Type":"ContainerStarted","Data":"d36a23952366158959b2e55ff9f44af9ed3f51a0ef08001c8323903e4a5395d9"} Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.505724 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.505971 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.005884268 +0000 UTC m=+152.257110981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.506086 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.511441 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.011422892 +0000 UTC m=+152.262649605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: W1202 15:54:48.513102 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff11bd0d_41c3_45a6_a955_259a754887e3.slice/crio-bd90000f073eb6216e15aa5ee379384691ed20352dee684ff54fa79c56f818ee WatchSource:0}: Error finding container bd90000f073eb6216e15aa5ee379384691ed20352dee684ff54fa79c56f818ee: Status 404 returned error can't find the container with id bd90000f073eb6216e15aa5ee379384691ed20352dee684ff54fa79c56f818ee Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.591604 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kmxmh" podStartSLOduration=126.591581318 podStartE2EDuration="2m6.591581318s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:48.587549872 +0000 UTC m=+151.838776575" watchObservedRunningTime="2025-12-02 15:54:48.591581318 +0000 UTC m=+151.842808021" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.609049 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.618767 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 15:54:49.118733161 +0000 UTC m=+152.369959864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.623083 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" podStartSLOduration=125.623064067 podStartE2EDuration="2m5.623064067s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:48.609090678 +0000 UTC m=+151.860317381" watchObservedRunningTime="2025-12-02 15:54:48.623064067 +0000 UTC m=+151.874290770" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.711804 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6dp2z" podStartSLOduration=126.711773942 podStartE2EDuration="2m6.711773942s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:48.668226904 +0000 UTC m=+151.919453607" watchObservedRunningTime="2025-12-02 15:54:48.711773942 +0000 UTC m=+151.963000635" Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.718789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.719440 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.219414291 +0000 UTC m=+152.470641004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.820364 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.820770 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.320751663 +0000 UTC m=+152.571978366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:48 crc kubenswrapper[4933]: I1202 15:54:48.921733 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:48 crc kubenswrapper[4933]: E1202 15:54:48.922570 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.422553009 +0000 UTC m=+152.673779712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.023428 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.023623 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.52358093 +0000 UTC m=+152.774807643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.023715 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.024092 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.524084026 +0000 UTC m=+152.775310729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.138298 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.138637 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.638607771 +0000 UTC m=+152.889834474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.138753 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.139171 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.639156369 +0000 UTC m=+152.890383082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.176208 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:49 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:49 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:49 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.176276 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.239998 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.240280 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.740245372 +0000 UTC m=+152.991472075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.240768 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.241139 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.74112534 +0000 UTC m=+152.992352033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.350670 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.350799 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.850782012 +0000 UTC m=+153.102008715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.351087 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.351409 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.851397972 +0000 UTC m=+153.102624675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.452299 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.452675 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:49.952659101 +0000 UTC m=+153.203885804 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.503263 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" event={"ID":"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4","Type":"ContainerStarted","Data":"b688351332e7f88edd76345d561ff07a14601172261f8fb5a5478fba7706d4c6"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.524331 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" event={"ID":"ba88c07b-f3ab-4abd-ad64-72d34148bc09","Type":"ContainerStarted","Data":"7381cdc7ea6d65ccdf30d17663c567c031906a86201eb22f25a51a06820597dd"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.536334 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" event={"ID":"6a612bec-8f7d-4fc1-bba2-877fc67b13f9","Type":"ContainerStarted","Data":"7cf50393c18e2a334722fac578c71356078080861a197174045c7727ceab2360"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.538576 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" event={"ID":"aae2ee36-4dcf-429c-985d-c891c771cb0d","Type":"ContainerStarted","Data":"e8738e39b1a2f0acbf5967bb2d141a4eb2d0282f0db6bb0d1b4ebd76d7cb7600"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.542148 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" event={"ID":"50f1708b-32f7-42c3-a3ec-57f654624efa","Type":"ContainerStarted","Data":"2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.542307 4933 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.545184 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" event={"ID":"331221b7-75d0-458f-84d2-098fa5170277","Type":"ContainerStarted","Data":"79ff6f7a98e90027c1bc33c7cf5c3a993854551c7e1d1131726a9ca883fa241b"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.548053 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.554319 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.554744 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.054729094 +0000 UTC m=+153.305955797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.563157 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" event={"ID":"da774c74-89be-4c8c-b2e2-0bad771112d2","Type":"ContainerStarted","Data":"622992776c2a016e87fe1c6f0363130cbc7ab5d29bfc23ea06e4532500389652"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.563599 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.582395 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" event={"ID":"ff11bd0d-41c3-45a6-a955-259a754887e3","Type":"ContainerStarted","Data":"bd90000f073eb6216e15aa5ee379384691ed20352dee684ff54fa79c56f818ee"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.582605 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvt57" podStartSLOduration=126.582582988 podStartE2EDuration="2m6.582582988s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.554005191 +0000 UTC m=+152.805231904" watchObservedRunningTime="2025-12-02 15:54:49.582582988 +0000 UTC m=+152.833809691" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.590468 4933 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" podStartSLOduration=127.590438525 podStartE2EDuration="2m7.590438525s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.582570748 +0000 UTC m=+152.833797471" watchObservedRunningTime="2025-12-02 15:54:49.590438525 +0000 UTC m=+152.841665238" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.616184 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" event={"ID":"0add1e2e-7e5a-4e56-8d51-85d089b4573c","Type":"ContainerStarted","Data":"1802b95aa96b5a43ce5a066689cad5dd05a213a1d6f4841278c732e86dfc10c7"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.618761 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2hj84" podStartSLOduration=127.618742863 podStartE2EDuration="2m7.618742863s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.616139122 +0000 UTC m=+152.867365825" watchObservedRunningTime="2025-12-02 15:54:49.618742863 +0000 UTC m=+152.869969566" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.631410 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" event={"ID":"78a11c22-fa5a-4573-ac7b-44bad6178356","Type":"ContainerStarted","Data":"cec9798d89cd1394894f60618cdf423ada3da66103562566f8efa9ac2dbd467b"} Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.651694 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" podStartSLOduration=126.651678237 podStartE2EDuration="2m6.651678237s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.650558712 +0000 UTC m=+152.901785415" watchObservedRunningTime="2025-12-02 15:54:49.651678237 +0000 UTC m=+152.902904940" Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.658391 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.658558 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.158535583 +0000 UTC m=+153.409762286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.658879 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.660381 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.16037177 +0000 UTC m=+153.411598473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.661370 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" event={"ID":"cb41f368-0638-40fd-be0f-bc71d0182af8","Type":"ContainerStarted","Data":"55c977583d2fbec56c83a31f6b1b81f12f19ab02eca1a47358e92937ad50773c"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.672555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" event={"ID":"061606b3-9a47-4cff-ad31-04e9a5a05528","Type":"ContainerStarted","Data":"615e51417560354569794c4ba9ab9b8cdcc4150c74af3b4c84d35bc434e10c11"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.686713 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" event={"ID":"e0806294-1664-4633-ba20-7b687a8cf4b2","Type":"ContainerStarted","Data":"e360c1bd3afd9cb13d819fe8caeb95096fb814dfc4a2ea5e8614cc71b649f216"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.691659 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" event={"ID":"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab","Type":"ContainerStarted","Data":"02eb0c9be6fc2c792593e70f414762401a072b2d8243d6d820fee6c35d3ab6c1"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.700567 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" event={"ID":"028c8291-1fc5-4035-b46f-3c618ecf154d","Type":"ContainerStarted","Data":"4e4ca247c3424cd3f0eeee76aacddcfd3e7d53132ce23146f6ff4c88295e5a97"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.708937 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" event={"ID":"97b17284-20cf-4265-a2e0-721fe06f8105","Type":"ContainerStarted","Data":"29ba8c8ceefa35bc346b2afc167e8835cce1dcdccedcc71e0822b92bfcbd062b"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.711483 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" event={"ID":"11c9d818-4012-423f-b3ce-bec8ac30f1d7","Type":"ContainerStarted","Data":"30c530fce0541189af024a3f9587d8114687b666d0ff8684af85e2de71b36f90"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.711753 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6qhmc" podStartSLOduration=127.711735933 podStartE2EDuration="2m7.711735933s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.700549382 +0000 UTC m=+152.951776085" watchObservedRunningTime="2025-12-02 15:54:49.711735933 +0000 UTC m=+152.962962636"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.722723 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cx7jx" podStartSLOduration=126.722702177 podStartE2EDuration="2m6.722702177s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.715599814 +0000 UTC m=+152.966826517" watchObservedRunningTime="2025-12-02 15:54:49.722702177 +0000 UTC m=+152.973928880"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.726337 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" event={"ID":"97c46bec-8e96-4d62-8808-549f712a9802","Type":"ContainerStarted","Data":"044d11bf10e397a8a41127d36183d01ab0215c39c452c4c961bcf7be13516bb1"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.727287 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.758107 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" podStartSLOduration=127.758089918 podStartE2EDuration="2m7.758089918s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.756207839 +0000 UTC m=+153.007434542" watchObservedRunningTime="2025-12-02 15:54:49.758089918 +0000 UTC m=+153.009316621"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.764218 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.764546 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.26451733 +0000 UTC m=+153.515744063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.765138 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.783851 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.283807565 +0000 UTC m=+153.535034268 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.800809 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.808244 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhcjs" podStartSLOduration=127.808215822 podStartE2EDuration="2m7.808215822s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.807001423 +0000 UTC m=+153.058228136" watchObservedRunningTime="2025-12-02 15:54:49.808215822 +0000 UTC m=+153.059442535"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.837662 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" event={"ID":"146d5bc3-8ff4-4175-9482-c9e3f220e5c0","Type":"ContainerStarted","Data":"549057d345d0be3581d1260452d3c02960fbcaa09524e24009ec7239d1d03ea4"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.858576 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" podStartSLOduration=127.858559492 podStartE2EDuration="2m7.858559492s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:49.857010933 +0000 UTC m=+153.108237636" watchObservedRunningTime="2025-12-02 15:54:49.858559492 +0000 UTC m=+153.109786195"
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.866187 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.867094 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.367075269 +0000 UTC m=+153.618301972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.891652 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" event={"ID":"806c8633-7969-405c-b445-734ae20ede22","Type":"ContainerStarted","Data":"2689d1d76be0ae7e7781bfc8812b392e807af1aa0223e5cf407829031a5235f2"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.951765 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" event={"ID":"ebb91b68-bf2f-43b1-a5d7-1c5de0e46e84","Type":"ContainerStarted","Data":"5194d3b74b5aa0f20bbf1a733db234ac7752b82c11820fff8b1e4d027a616427"}
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.968033 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:49 crc kubenswrapper[4933]: E1202 15:54:49.968601 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.468582456 +0000 UTC m=+153.719809159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:49 crc kubenswrapper[4933]: I1202 15:54:49.990541 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" event={"ID":"c657682e-039a-46ee-b4c7-a5f95ee75e44","Type":"ContainerStarted","Data":"40ae5006ca7d6f61a4f48b7fede4461abfc867eaf75f1a10c43d58e57b73e7b7"}
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.026343 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-774lf" podStartSLOduration=128.026313539 podStartE2EDuration="2m8.026313539s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:50.017972427 +0000 UTC m=+153.269199150" watchObservedRunningTime="2025-12-02 15:54:50.026313539 +0000 UTC m=+153.277540262"
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.032241 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g4vq2" event={"ID":"d0cbe25c-56a8-4824-ace7-fff562288389","Type":"ContainerStarted","Data":"fe0eb2389255d1931710f1cf8aaec9304e998ecd73922d89bf58207127c2ef17"}
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.077632 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" event={"ID":"b850db5e-74db-4823-bbb3-132b75b17c30","Type":"ContainerStarted","Data":"2d61869664f9f384baad50538941daccfcc9433e619082d5b12c290925265137"}
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.082032 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.082495 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.582471432 +0000 UTC m=+153.833698125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.104522 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" event={"ID":"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe","Type":"ContainerStarted","Data":"1c0e482f272cf89c823c1709e2ebfaae9ed52a347fe3e410298e778ed753df2e"}
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.114518 4933 generic.go:334] "Generic (PLEG): container finished" podID="14eded2c-58d8-4348-a7ae-1a027349ae71" containerID="a3774b06dfe29de8bfeba16c8c7281b042a4f9e39db87bb9873424ab936d72f4" exitCode=0
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.114832 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" event={"ID":"14eded2c-58d8-4348-a7ae-1a027349ae71","Type":"ContainerDied","Data":"a3774b06dfe29de8bfeba16c8c7281b042a4f9e39db87bb9873424ab936d72f4"}
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.123172 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.133164 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6dp2z"
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.173263 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 15:54:50 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld
Dec 02 15:54:50 crc kubenswrapper[4933]: [+]process-running ok
Dec 02 15:54:50 crc kubenswrapper[4933]: healthz check failed
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.173323 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.193119 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.197521 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.697500793 +0000 UTC m=+153.948727616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.252067 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lcwgg" podStartSLOduration=128.252047895 podStartE2EDuration="2m8.252047895s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:50.249608109 +0000 UTC m=+153.500834812" watchObservedRunningTime="2025-12-02 15:54:50.252047895 +0000 UTC m=+153.503274598"
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.294687 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.295022 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.795001204 +0000 UTC m=+154.046227907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.398803 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.399508 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:50.899458263 +0000 UTC m=+154.150685136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.499708 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.500186 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.000166395 +0000 UTC m=+154.251393098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.601321 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.601750 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.101734643 +0000 UTC m=+154.352961346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.702540 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.702905 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.202889309 +0000 UTC m=+154.454116012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.804578 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.805178 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.305156189 +0000 UTC m=+154.556382892 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.907609 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.907954 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.407913125 +0000 UTC m=+154.659139828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:50 crc kubenswrapper[4933]: I1202 15:54:50.909410 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:50 crc kubenswrapper[4933]: E1202 15:54:50.909746 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.409732182 +0000 UTC m=+154.660958875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.011363 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.011716 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.511696493 +0000 UTC m=+154.762923206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.113089 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.113484 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.613468008 +0000 UTC m=+154.864694711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.145145 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" event={"ID":"748b551d-be9d-4862-9a91-12ca4ccc71fa","Type":"ContainerStarted","Data":"0f876535f9d40f3c6d9f786de7a94095e83ed834bb6ca215fea84f29926cdc68"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.145778 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.149993 4933 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d99zb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body=
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.150061 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" podUID="748b551d-be9d-4862-9a91-12ca4ccc71fa" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.152181 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v2svg" event={"ID":"37c7afce-94aa-439e-9415-cdf79931a95e","Type":"ContainerStarted","Data":"3db0792110bc6478b9de440070e89eb7a7afe74a774797a70a6955ee71f837c0"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.166071 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 15:54:51 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld
Dec 02 15:54:51 crc kubenswrapper[4933]: [+]process-running ok
Dec 02 15:54:51 crc kubenswrapper[4933]: healthz check failed
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.166145 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.170525 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" event={"ID":"331221b7-75d0-458f-84d2-098fa5170277","Type":"ContainerStarted","Data":"382c06d886dfb02de721910ed3ba8cf1f615db5fc9159473a4ba4147a9914741"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.179016 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" event={"ID":"e75ec078-9907-4485-a7a4-991622b1788d","Type":"ContainerStarted","Data":"4bd117b3fdb2d0b9eb7b37fb9df815ec35a3bbce0ffe288b5f23483887cdac51"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.179054 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" event={"ID":"e75ec078-9907-4485-a7a4-991622b1788d","Type":"ContainerStarted","Data":"f09ecf5771186e345a136c24750f3b35c5fd3b88eb6d84f5cc5693f72276b347"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.180748 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" podStartSLOduration=128.18073489 podStartE2EDuration="2m8.18073489s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.180014867 +0000 UTC m=+154.431241580" watchObservedRunningTime="2025-12-02 15:54:51.18073489 +0000 UTC m=+154.431961583"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.215302 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.249078 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.749046385 +0000 UTC m=+155.000273088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.274472 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g4vq2" event={"ID":"d0cbe25c-56a8-4824-ace7-fff562288389","Type":"ContainerStarted","Data":"1cec58a1e81b8779a3f58ef147a20fb524435f997647367bcbdda37650ecb337"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.275156 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-g4vq2"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.311220 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v2svg" podStartSLOduration=8.311195536 podStartE2EDuration="8.311195536s" podCreationTimestamp="2025-12-02 15:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.27439154 +0000 UTC m=+154.525618243" watchObservedRunningTime="2025-12-02 15:54:51.311195536 +0000 UTC m=+154.562422239"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.313345 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xg27x" podStartSLOduration=129.313337233 podStartE2EDuration="2m9.313337233s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.310019449 +0000 UTC m=+154.561246152" watchObservedRunningTime="2025-12-02 15:54:51.313337233 +0000 UTC m=+154.564563936"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.313968 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" event={"ID":"b850db5e-74db-4823-bbb3-132b75b17c30","Type":"ContainerStarted","Data":"f46a3adf99e44423ac48821a4ba1ca41b3a975c41250f164b4449cea4cef2a0e"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.317718 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.321242 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.821225411 +0000 UTC m=+155.072452114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.326252 4933 generic.go:334] "Generic (PLEG): container finished" podID="78a11c22-fa5a-4573-ac7b-44bad6178356" containerID="cec9798d89cd1394894f60618cdf423ada3da66103562566f8efa9ac2dbd467b" exitCode=0
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.326631 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" event={"ID":"78a11c22-fa5a-4573-ac7b-44bad6178356","Type":"ContainerDied","Data":"cec9798d89cd1394894f60618cdf423ada3da66103562566f8efa9ac2dbd467b"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.369247 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xfcv2" podStartSLOduration=128.369227198 podStartE2EDuration="2m8.369227198s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.367901706 +0000 UTC m=+154.619128409" watchObservedRunningTime="2025-12-02 15:54:51.369227198 +0000 UTC m=+154.620453891"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.375348 4933 generic.go:334] "Generic (PLEG): container finished" podID="146d5bc3-8ff4-4175-9482-c9e3f220e5c0" containerID="549057d345d0be3581d1260452d3c02960fbcaa09524e24009ec7239d1d03ea4" exitCode=0
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.375510 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" event={"ID":"146d5bc3-8ff4-4175-9482-c9e3f220e5c0","Type":"ContainerDied","Data":"549057d345d0be3581d1260452d3c02960fbcaa09524e24009ec7239d1d03ea4"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.375555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" event={"ID":"146d5bc3-8ff4-4175-9482-c9e3f220e5c0","Type":"ContainerStarted","Data":"172045b9eee289b67ea83145360101e02f1281590d2f5366b16c1ea0dc6bc6a1"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.417654 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" event={"ID":"89e4dcb7-4b26-46b9-9346-61d13b21285a","Type":"ContainerStarted","Data":"ac24d2ec7c0861322df0bed7f24806c063960fcf7738a1ee455fb116b4dce2a3"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.418523 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.419630 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:51.919614749 +0000 UTC m=+155.170841452 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.439507 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" event={"ID":"3d3dbac3-1faa-46d1-8a2a-ddf41a7f1f0f","Type":"ContainerStarted","Data":"f31239f7408729fbe158d1ad82001a84e886a770acdaaa31f43a62df071b62a6"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.442240 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b986z" podStartSLOduration=129.442229119 podStartE2EDuration="2m9.442229119s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.39509734 +0000 UTC m=+154.646324043" watchObservedRunningTime="2025-12-02 15:54:51.442229119 +0000 UTC m=+154.693455822"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.452350 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.471452 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.476797 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" event={"ID":"028c8291-1fc5-4035-b46f-3c618ecf154d","Type":"ContainerStarted","Data":"971014c8c39c411970b74e55852f5cecffd3010ff8d8ab0102ea62d634a08ae2"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.492377 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g4vq2" podStartSLOduration=8.492352683 podStartE2EDuration="8.492352683s" podCreationTimestamp="2025-12-02 15:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.488509422 +0000 UTC m=+154.739736125" watchObservedRunningTime="2025-12-02 15:54:51.492352683 +0000 UTC m=+154.743579386"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.519621 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" event={"ID":"b7c9b4f1-183e-4d60-bb84-b2995d89b0fe","Type":"ContainerStarted","Data":"a944117bf297407987fefa3cc005ba23bb6f811dc4162dfef8876b9b019c8664"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.521579 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.522966 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.022948313 +0000 UTC m=+155.274175016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.546935 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" event={"ID":"806c8633-7969-405c-b445-734ae20ede22","Type":"ContainerStarted","Data":"ffc800f24598990cbc16673f57aacc7684e5f356683344f4585f3de0e89a8499"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.554119 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" podStartSLOduration=128.554091321 podStartE2EDuration="2m8.554091321s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.537381156 +0000 UTC m=+154.788607859" watchObservedRunningTime="2025-12-02 15:54:51.554091321 +0000 UTC m=+154.805318024"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.555896 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckqs2" podStartSLOduration=128.555888167 podStartE2EDuration="2m8.555888167s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.552289284 +0000 UTC m=+154.803516007" watchObservedRunningTime="2025-12-02 15:54:51.555888167 +0000 UTC m=+154.807114870"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.566788 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9r8tr" event={"ID":"8b382362-3187-4571-a8a0-057cbccc89ff","Type":"ContainerStarted","Data":"eef55e3d29a2e819e4a1b457a8021f73db22cb87babfb088a58fd8e34e106b9e"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.567934 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9r8tr"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.571348 4933 patch_prober.go:28] interesting pod/downloads-7954f5f757-9r8tr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.571420 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9r8tr" podUID="8b382362-3187-4571-a8a0-057cbccc89ff" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.576101 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" event={"ID":"2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4","Type":"ContainerStarted","Data":"23b588fa1f77c86d84246c1d4bdbbc75a26f2764cf4f72b2f4388ad31d7f9255"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.614416 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" event={"ID":"ba88c07b-f3ab-4abd-ad64-72d34148bc09","Type":"ContainerStarted","Data":"30800067230e3835da19fbf00c3c9273d5d2b6b785e4031b7b541212774c841d"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.629706 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.630010 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.129983324 +0000 UTC m=+155.381210017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.630250 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.638256 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.138231163 +0000 UTC m=+155.389457866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.652732 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ws24r" podStartSLOduration=128.652711087 podStartE2EDuration="2m8.652711087s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.600792247 +0000 UTC m=+154.852018960" watchObservedRunningTime="2025-12-02 15:54:51.652711087 +0000 UTC m=+154.903937780"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.658084 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" event={"ID":"cbd45ad7-14dd-46e8-941b-4d2bfced7567","Type":"ContainerStarted","Data":"95b551eeee47749f32b7d4b648fb2e57811b34b5496e3f026a5895b6e692c8c6"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.689754 4933 generic.go:334] "Generic (PLEG): container finished" podID="11c9d818-4012-423f-b3ce-bec8ac30f1d7" containerID="30c530fce0541189af024a3f9587d8114687b666d0ff8684af85e2de71b36f90" exitCode=0
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.690126 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" event={"ID":"11c9d818-4012-423f-b3ce-bec8ac30f1d7","Type":"ContainerDied","Data":"30c530fce0541189af024a3f9587d8114687b666d0ff8684af85e2de71b36f90"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.703896 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9r8tr" podStartSLOduration=129.703875603 podStartE2EDuration="2m9.703875603s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.696996517 +0000 UTC m=+154.948223240" watchObservedRunningTime="2025-12-02 15:54:51.703875603 +0000 UTC m=+154.955102306"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.705654 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2r9nt" podStartSLOduration=128.705646999 podStartE2EDuration="2m8.705646999s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.654418051 +0000 UTC m=+154.905644764" watchObservedRunningTime="2025-12-02 15:54:51.705646999 +0000 UTC m=+154.956873692"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.712465 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" event={"ID":"cb41f368-0638-40fd-be0f-bc71d0182af8","Type":"ContainerStarted","Data":"c583d93a0b8b419befad740071a84b41fe53c2fb71edd40029821c1edf52785c"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.713625 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.734179 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.735279 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.235230968 +0000 UTC m=+155.486457671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.735615 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.736019 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.236005212 +0000 UTC m=+155.487231905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.755014 4933 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r7dnz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.755081 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.774369 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" event={"ID":"14eded2c-58d8-4348-a7ae-1a027349ae71","Type":"ContainerStarted","Data":"74ab9159c8f560e0e69a08f50e97a82d3ed0b6e66c3587e1740cae6cff1fe572"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.775732 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-brhm4" podStartSLOduration=128.775717929 podStartE2EDuration="2m8.775717929s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.733455482 +0000 UTC m=+154.984682185" watchObservedRunningTime="2025-12-02 15:54:51.775717929 +0000 UTC m=+155.026944632"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.797151 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" event={"ID":"ff11bd0d-41c3-45a6-a955-259a754887e3","Type":"ContainerStarted","Data":"946311ebba2d497d6647b768b6bdf9f52e348abbe163a698741d827773260fc2"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.798011 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.817229 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" event={"ID":"e0806294-1664-4633-ba20-7b687a8cf4b2","Type":"ContainerStarted","Data":"61b1beaea1c1fd86b6f0aca837dfbf7a96575c6bd495c9ee44b6d3bbf7697067"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.831754 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-d86pn" podStartSLOduration=129.831734488 podStartE2EDuration="2m9.831734488s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.78277193 +0000 UTC m=+155.033998633" watchObservedRunningTime="2025-12-02 15:54:51.831734488 +0000 UTC m=+155.082961181"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.832424 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-htb5t" podStartSLOduration=128.832418199 podStartE2EDuration="2m8.832418199s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.829775346 +0000 UTC m=+155.081002049" watchObservedRunningTime="2025-12-02 15:54:51.832418199 +0000 UTC m=+155.083644902"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.846913 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.849445 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.349423413 +0000 UTC m=+155.600650116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.850114 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" event={"ID":"c657682e-039a-46ee-b4c7-a5f95ee75e44","Type":"ContainerStarted","Data":"f908c3586df3dd5c7aa834f5b0a8427742cc732b44d387dd303028d38f50d675"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.850183 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" event={"ID":"c657682e-039a-46ee-b4c7-a5f95ee75e44","Type":"ContainerStarted","Data":"193fcc6e673e70be573d96520210692847236c57c293e433c288b36e73a623fa"}
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.851991 4933 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-hdrfm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.852066 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" podUID="ff11bd0d-41c3-45a6-a955-259a754887e3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.875247 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" podStartSLOduration=128.875222723 podStartE2EDuration="2m8.875222723s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.873788518 +0000 UTC m=+155.125015221" watchObservedRunningTime="2025-12-02 15:54:51.875222723 +0000 UTC m=+155.126449426"
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.949904 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:51 crc kubenswrapper[4933]: E1202 15:54:51.952401 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.452379415 +0000 UTC m=+155.703606118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:51 crc kubenswrapper[4933]: I1202 15:54:51.982327 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" podStartSLOduration=128.982302594 podStartE2EDuration="2m8.982302594s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:51.980650983 +0000 UTC m=+155.231877696" watchObservedRunningTime="2025-12-02 15:54:51.982302594 +0000 UTC m=+155.233529297"
Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.003700 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6f2dw" podStartSLOduration=129.003682446 podStartE2EDuration="2m9.003682446s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:52.002815959 +0000 UTC m=+155.254042672" watchObservedRunningTime="2025-12-02 15:54:52.003682446 +0000 UTC m=+155.254909169"
Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.033562 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4g7n" podStartSLOduration=130.033541933 podStartE2EDuration="2m10.033541933s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:52.031760808 +0000 UTC m=+155.282987531" watchObservedRunningTime="2025-12-02 15:54:52.033541933 +0000 UTC m=+155.284768656"
Dec 02 15:54:52 crc
kubenswrapper[4933]: I1202 15:54:52.060766 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.061423 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.561399978 +0000 UTC m=+155.812626681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.162725 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.163133 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.663111891 +0000 UTC m=+155.914338594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.164156 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:52 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:52 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:52 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.164209 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.263809 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.263978 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.763942957 +0000 UTC m=+156.015169660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.264111 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.264386 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.764379101 +0000 UTC m=+156.015605794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.365568 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.365741 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.865719122 +0000 UTC m=+156.116945835 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.365894 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.366257 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.866249289 +0000 UTC m=+156.117475982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.467893 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.468104 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.968075145 +0000 UTC m=+156.219301848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.468166 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.468501 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:52.968493769 +0000 UTC m=+156.219720472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.570169 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.570374 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.070341386 +0000 UTC m=+156.321568089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.570810 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.571393 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.071376188 +0000 UTC m=+156.322602891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.672037 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.672191 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.172171843 +0000 UTC m=+156.423398546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.672251 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.672585 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.172574685 +0000 UTC m=+156.423801388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.754078 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54gm8"] Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.755328 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: W1202 15:54:52.761293 4933 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.761348 4933 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.773450 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.773683 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.273647969 +0000 UTC m=+156.524874672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.773845 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.774297 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.274286759 +0000 UTC m=+156.525513462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.798801 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54gm8"] Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.858402 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" event={"ID":"14eded2c-58d8-4348-a7ae-1a027349ae71","Type":"ContainerStarted","Data":"e7e10e18db472a9e49138617b9c5771e5cd4a6433010cd74d4bfc81e02c6fcba"} Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.860886 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" event={"ID":"78a11c22-fa5a-4573-ac7b-44bad6178356","Type":"ContainerStarted","Data":"43031abccfff062deb563b7280e7da8a5b4b91e2f560401ca3d2a151ce68644e"} Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.860980 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.863597 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" event={"ID":"806c8633-7969-405c-b445-734ae20ede22","Type":"ContainerStarted","Data":"13f872665967716cf9e0c488fe8376a5f3271227e5f8313e3008edf707df32e6"} Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.867180 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" event={"ID":"cbd45ad7-14dd-46e8-941b-4d2bfced7567","Type":"ContainerStarted","Data":"e5993d2c33bfd4c8748298768e81efe64e13f47b24d95f9543086b2371a2dfba"} Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.868868 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" event={"ID":"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab","Type":"ContainerStarted","Data":"c11983a09386503da5947ba8b763b695b48d8b902da196539064defffa7ec8ac"} Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.870933 4933 patch_prober.go:28] interesting pod/downloads-7954f5f757-9r8tr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.870986 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9r8tr" podUID="8b382362-3187-4571-a8a0-057cbccc89ff" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.871104 4933 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r7dnz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 02 
15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.871150 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.875972 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hdrfm" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.877148 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.877439 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-catalog-content\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.877494 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-utilities\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.877537 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrk7\" (UniqueName: \"kubernetes.io/projected/c7cef2b5-6b23-47ba-83ce-161a02f128a1-kube-api-access-hjrk7\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.877635 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.377619213 +0000 UTC m=+156.628845916 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.975171 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mp97n"] Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.976310 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.978568 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-utilities\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.978617 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.978938 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjrk7\" (UniqueName: \"kubernetes.io/projected/c7cef2b5-6b23-47ba-83ce-161a02f128a1-kube-api-access-hjrk7\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.979311 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-catalog-content\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.983434 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-utilities\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:52 crc kubenswrapper[4933]: E1202 15:54:52.984401 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.484361334 +0000 UTC m=+156.735588037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:52 crc kubenswrapper[4933]: I1202 15:54:52.986522 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-catalog-content\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:53 crc kubenswrapper[4933]: W1202 15:54:53.015683 4933 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.015746 4933 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.047378 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mp97n"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.049503 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xnlpk"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.050428 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.060179 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrk7\" (UniqueName: \"kubernetes.io/projected/c7cef2b5-6b23-47ba-83ce-161a02f128a1-kube-api-access-hjrk7\") pod \"community-operators-54gm8\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.082214 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.082393 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-utilities\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.082440 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbjr\" (UniqueName: \"kubernetes.io/projected/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-kube-api-access-mfbjr\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.082478 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-catalog-content\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.082679 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.58266 +0000 UTC m=+156.833886703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.106766 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jgdkm" podStartSLOduration=131.106733596 podStartE2EDuration="2m11.106733596s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:53.077129266 +0000 UTC m=+156.328355979" watchObservedRunningTime="2025-12-02 15:54:53.106733596 +0000 UTC m=+156.357960299" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.113095 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d99zb" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.128449 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnlpk"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.169624 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:53 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:53 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:53 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.169680 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192054 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-utilities\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192116 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfbjr\" (UniqueName: \"kubernetes.io/projected/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-kube-api-access-mfbjr\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192156 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-catalog-content\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192196 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjfbf\" (UniqueName: \"kubernetes.io/projected/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-kube-api-access-pjfbf\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192217 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-utilities\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192258 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192301 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-catalog-content\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.192863 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-utilities\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.193534 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-catalog-content\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.193959 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.693942693 +0000 UTC m=+156.945169396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.273117 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kph29"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.275364 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kph29" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.286804 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfbjr\" (UniqueName: \"kubernetes.io/projected/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-kube-api-access-mfbjr\") pod \"certified-operators-mp97n\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.293617 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.293852 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.793801508 +0000 UTC m=+157.045028211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.294317 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjfbf\" (UniqueName: \"kubernetes.io/projected/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-kube-api-access-pjfbf\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.294433 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-utilities\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.294612 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.294721 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-catalog-content\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.295315 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-catalog-content\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.295463 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-utilities\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.295800 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.79578552 +0000 UTC m=+157.047012223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.310519 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kph29"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.313722 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss" podStartSLOduration=131.313692812 podStartE2EDuration="2m11.313692812s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:53.307494538 +0000 UTC m=+156.558721241" watchObservedRunningTime="2025-12-02 15:54:53.313692812 +0000 UTC m=+156.564919515" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.350218 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.370773 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjfbf\" (UniqueName: \"kubernetes.io/projected/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-kube-api-access-pjfbf\") pod \"community-operators-xnlpk\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") " pod="openshift-marketplace/community-operators-xnlpk" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.371074 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.371225 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.391497 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.392036 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.396246 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.396587 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-catalog-content\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.396644 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-utilities\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29" Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.396687 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95k6\" (UniqueName: \"kubernetes.io/projected/49685142-ff9c-4dec-9f6a-dbc8f06b7174-kube-api-access-p95k6\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29" Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.396884 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:53.896855473 +0000 UTC m=+157.148082186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.489286 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-tgwdx" podStartSLOduration=130.489254664 podStartE2EDuration="2m10.489254664s" podCreationTimestamp="2025-12-02 15:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:53.392456335 +0000 UTC m=+156.643683038" watchObservedRunningTime="2025-12-02 15:54:53.489254664 +0000 UTC m=+156.740481367"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.489893 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" podStartSLOduration=131.489887663 podStartE2EDuration="2m11.489887663s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:53.485439154 +0000 UTC m=+156.736665847" watchObservedRunningTime="2025-12-02 15:54:53.489887663 +0000 UTC m=+156.741114366"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.501924 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-utilities\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.502065 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95k6\" (UniqueName: \"kubernetes.io/projected/49685142-ff9c-4dec-9f6a-dbc8f06b7174-kube-api-access-p95k6\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.502129 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.502245 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.502328 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.502420 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-catalog-content\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.503323 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-catalog-content\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.503762 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-utilities\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.504138 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.00411398 +0000 UTC m=+157.255340863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.560413 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95k6\" (UniqueName: \"kubernetes.io/projected/49685142-ff9c-4dec-9f6a-dbc8f06b7174-kube-api-access-p95k6\") pod \"certified-operators-kph29\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.611537 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.611701 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.611791 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.611892 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.611963 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.111948235 +0000 UTC m=+157.363174938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.676243 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.717612 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.718055 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.218041696 +0000 UTC m=+157.469268399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.766372 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.788003 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.828691 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume\") pod \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") "
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.829042 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.829083 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11c9d818-4012-423f-b3ce-bec8ac30f1d7-secret-volume\") pod \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") "
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.829175 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfchz\" (UniqueName: \"kubernetes.io/projected/11c9d818-4012-423f-b3ce-bec8ac30f1d7-kube-api-access-nfchz\") pod \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\" (UID: \"11c9d818-4012-423f-b3ce-bec8ac30f1d7\") "
Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.829366 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.32933386 +0000 UTC m=+157.580560573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.829461 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.830018 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.329996291 +0000 UTC m=+157.581222994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.830930 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "11c9d818-4012-423f-b3ce-bec8ac30f1d7" (UID: "11c9d818-4012-423f-b3ce-bec8ac30f1d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.838375 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c9d818-4012-423f-b3ce-bec8ac30f1d7-kube-api-access-nfchz" (OuterVolumeSpecName: "kube-api-access-nfchz") pod "11c9d818-4012-423f-b3ce-bec8ac30f1d7" (UID: "11c9d818-4012-423f-b3ce-bec8ac30f1d7"). InnerVolumeSpecName "kube-api-access-nfchz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.850338 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c9d818-4012-423f-b3ce-bec8ac30f1d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11c9d818-4012-423f-b3ce-bec8ac30f1d7" (UID: "11c9d818-4012-423f-b3ce-bec8ac30f1d7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.912774 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" event={"ID":"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab","Type":"ContainerStarted","Data":"7abd768009064a7de90db0a7ad706ebc1b2390444e2ee48901b9a44cee7adecd"}
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.940371 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.940657 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfchz\" (UniqueName: \"kubernetes.io/projected/11c9d818-4012-423f-b3ce-bec8ac30f1d7-kube-api-access-nfchz\") on node \"crc\" DevicePath \"\""
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.940669 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11c9d818-4012-423f-b3ce-bec8ac30f1d7-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.940678 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11c9d818-4012-423f-b3ce-bec8ac30f1d7-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 15:54:53 crc kubenswrapper[4933]: E1202 15:54:53.940740 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.440726557 +0000 UTC m=+157.691953260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.960012 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.971001 4933 patch_prober.go:28] interesting pod/downloads-7954f5f757-9r8tr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.971079 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9r8tr" podUID="8b382362-3187-4571-a8a0-057cbccc89ff" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.971138 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv" event={"ID":"11c9d818-4012-423f-b3ce-bec8ac30f1d7","Type":"ContainerDied","Data":"a28e87bc132f50e73d8fb72cf28cbf990c33d45d9b1100d63afda83307f367a2"}
Dec 02 15:54:53 crc kubenswrapper[4933]: I1202 15:54:53.971209 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28e87bc132f50e73d8fb72cf28cbf990c33d45d9b1100d63afda83307f367a2"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.049572 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.052334 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.55231799 +0000 UTC m=+157.803544693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.079428 4933 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/community-operators-54gm8" secret="" err="failed to sync secret cache: timed out waiting for the condition"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.079900 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54gm8"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.091342 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.152881 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.153306 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.65328873 +0000 UTC m=+157.904515433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.168169 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 15:54:54 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld
Dec 02 15:54:54 crc kubenswrapper[4933]: [+]process-running ok
Dec 02 15:54:54 crc kubenswrapper[4933]: healthz check failed
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.168412 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.254597 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.254977 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.754965972 +0000 UTC m=+158.006192675 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.298283 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.304766 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.318360 4933 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/certified-operators-mp97n" secret="" err="failed to sync secret cache: timed out waiting for the condition"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.318440 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mp97n"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.358439 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.359127 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.859111882 +0000 UTC m=+158.110338585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.459792 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.460485 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:54.960474084 +0000 UTC m=+158.211700787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.461128 4933 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.561618 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.561937 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.061915299 +0000 UTC m=+158.313142002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.564076 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.603674 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.606047 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.673058 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.677029 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.177009482 +0000 UTC m=+158.428236185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.777303 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.777657 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.277641431 +0000 UTC m=+158.528868134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.878585 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.878912 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.37890135 +0000 UTC m=+158.630128053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.890619 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mp97n"]
Dec 02 15:54:54 crc kubenswrapper[4933]: W1202 15:54:54.912917 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f5f6b9a_09c1_46e9_9fba_1de95dd3244c.slice/crio-3384e192e61babc0ec7997e0b9abf00142a09c86ca8c0cd569a8635d573e9ddd WatchSource:0}: Error finding container 3384e192e61babc0ec7997e0b9abf00142a09c86ca8c0cd569a8635d573e9ddd: Status 404 returned error can't find the container with id 3384e192e61babc0ec7997e0b9abf00142a09c86ca8c0cd569a8635d573e9ddd
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.988210 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.988376 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.488358167 +0000 UTC m=+158.739584870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:54 crc kubenswrapper[4933]: I1202 15:54:54.988690 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:54 crc kubenswrapper[4933]: E1202 15:54:54.989097 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.489086109 +0000 UTC m=+158.740312812 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.036772 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mv2m4"]
Dec 02 15:54:55 crc kubenswrapper[4933]: E1202 15:54:55.045355 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c9d818-4012-423f-b3ce-bec8ac30f1d7" containerName="collect-profiles"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.045691 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c9d818-4012-423f-b3ce-bec8ac30f1d7" containerName="collect-profiles"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.046097 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c9d818-4012-423f-b3ce-bec8ac30f1d7" containerName="collect-profiles"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.051366 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.071386 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.097032 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:55 crc kubenswrapper[4933]: E1202 15:54:55.097370 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.597353518 +0000 UTC m=+158.848580221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.097516 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-utilities\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.097590 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.097645 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6gq\" (UniqueName: \"kubernetes.io/projected/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-kube-api-access-fj6gq\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.097671 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-catalog-content\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: E1202 15:54:55.098128 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.598120853 +0000 UTC m=+158.849347556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.171109 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 15:54:55 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]process-running ok
Dec 02 15:54:55 crc kubenswrapper[4933]: healthz check failed
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.171163 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.198424 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.198658 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-catalog-content\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.198712 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-utilities\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.198794 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj6gq\" (UniqueName: \"kubernetes.io/projected/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-kube-api-access-fj6gq\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: E1202 15:54:55.199607 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.699590418 +0000 UTC m=+158.950817141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.200092 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-catalog-content\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.203152 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-utilities\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.258600 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv2m4"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.258687 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b2sss"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.258703 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" event={"ID":"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab","Type":"ContainerStarted","Data":"b161b11a9e658233b3c47bfd3cd3ee7da2cf978e0a10108a688173d2f650d45b"}
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.258725 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnlpk"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.258738 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mp97n" event={"ID":"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c","Type":"ContainerStarted","Data":"3384e192e61babc0ec7997e0b9abf00142a09c86ca8c0cd569a8635d573e9ddd"}
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.258749 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c","Type":"ContainerStarted","Data":"52192488db8d0d91e56d3c3538a78056bfad94b24e90d67fc322820bfabcb766"}
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.259855 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6gq\" (UniqueName: \"kubernetes.io/projected/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-kube-api-access-fj6gq\") pod \"redhat-marketplace-mv2m4\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.262436 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54gm8"]
Dec 02 15:54:55 crc kubenswrapper[4933]: W1202 15:54:55.262705 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcfaf9e9_2939_430d_b0c8_7f8daaca0f42.slice/crio-30122f2d14f9ca26839fb2fb7fa4bd11c23222863da7aa653dbd03451243670f WatchSource:0}: Error finding container 30122f2d14f9ca26839fb2fb7fa4bd11c23222863da7aa653dbd03451243670f: Status 404 returned error can't find the container with id 30122f2d14f9ca26839fb2fb7fa4bd11c23222863da7aa653dbd03451243670f
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.295161 4933 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T15:54:54.461203527Z","Handler":null,"Name":""}
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.303466 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:55 crc kubenswrapper[4933]: E1202 15:54:55.303960 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 15:54:55.803942504 +0000 UTC m=+159.055169207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lvtt7" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.304186 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.304931 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.310280 4933 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.310379 4933 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.333066 4933 patch_prober.go:28] interesting pod/apiserver-76f77b778f-w8l5x container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]log ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]etcd ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/max-in-flight-filter ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Dec 02 15:54:55 crc kubenswrapper[4933]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/project.openshift.io-projectcache ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/openshift.io-startinformers ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/openshift.io-restmapperupdater ok
Dec 02 15:54:55 crc kubenswrapper[4933]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 02 15:54:55 crc kubenswrapper[4933]: livez check failed
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.334418 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" podUID="14eded2c-58d8-4348-a7ae-1a027349ae71" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.400915 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kph29"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.406794 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.415881 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qmg"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.417215 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.428618 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.435943 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qmg"]
Dec 02 15:54:55 crc kubenswrapper[4933]: W1202 15:54:55.501965 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49685142_ff9c_4dec_9f6a_dbc8f06b7174.slice/crio-df2f971299ec7bc71cfcaac58fceaa5a2e4b14721b76f02163d8e8c852a5b4d7 WatchSource:0}: Error finding container df2f971299ec7bc71cfcaac58fceaa5a2e4b14721b76f02163d8e8c852a5b4d7: Status 404 returned error can't find the container with id df2f971299ec7bc71cfcaac58fceaa5a2e4b14721b76f02163d8e8c852a5b4d7
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.502980 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.508287 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-utilities\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.508345 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.508388 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-catalog-content\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.508418 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jwpm\" (UniqueName: \"kubernetes.io/projected/e4379206-f985-4847-90b4-604172bb7e6d-kube-api-access-6jwpm\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.551594 4933 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.551643 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.589187 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lvtt7\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.609766 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-catalog-content\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.609922 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jwpm\" (UniqueName: \"kubernetes.io/projected/e4379206-f985-4847-90b4-604172bb7e6d-kube-api-access-6jwpm\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.610012 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-utilities\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.610889 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-utilities\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.610929 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-catalog-content\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.631159 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jwpm\" (UniqueName: \"kubernetes.io/projected/e4379206-f985-4847-90b4-604172bb7e6d-kube-api-access-6jwpm\") pod \"redhat-marketplace-w9qmg\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.651565 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.698280 4933 patch_prober.go:28] interesting pod/downloads-7954f5f757-9r8tr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.698675 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9r8tr" podUID="8b382362-3187-4571-a8a0-057cbccc89ff" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.699388 4933 patch_prober.go:28] interesting pod/downloads-7954f5f757-9r8tr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.699494 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9r8tr" podUID="8b382362-3187-4571-a8a0-057cbccc89ff" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.739848 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lcwgg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.739881 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lcwgg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.740511 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.742067 4933 patch_prober.go:28] interesting pod/console-f9d7485db-lcwgg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.742118 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lcwgg" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.841254 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.841298 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.841753 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jb7rb"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.847169 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.854904 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.855630 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv2m4"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.858382 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jb7rb"]
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.858569 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.938509 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-catalog-content\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.939892 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxg4\" (UniqueName: \"kubernetes.io/projected/43ed131f-a943-4e5e-a11f-e07507790a41-kube-api-access-gxxg4\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:54:55 crc kubenswrapper[4933]: I1202 15:54:55.939955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-utilities\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.014798 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5sjn5"]
Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.018084 4933 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.027581 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5sjn5"] Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.041016 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-catalog-content\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.041084 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxg4\" (UniqueName: \"kubernetes.io/projected/43ed131f-a943-4e5e-a11f-e07507790a41-kube-api-access-gxxg4\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.041108 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-utilities\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.048355 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qmg"] Dec 02 15:54:56 crc kubenswrapper[4933]: W1202 15:54:56.051920 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4379206_f985_4847_90b4_604172bb7e6d.slice/crio-dddde4ccf5e780f758010ce59f192f8c0a4a015be674c8c6efb82e01d45fbf6f WatchSource:0}: Error finding container dddde4ccf5e780f758010ce59f192f8c0a4a015be674c8c6efb82e01d45fbf6f: Status 404 returned error can't find the container with id dddde4ccf5e780f758010ce59f192f8c0a4a015be674c8c6efb82e01d45fbf6f Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.063115 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxg4\" (UniqueName: \"kubernetes.io/projected/43ed131f-a943-4e5e-a11f-e07507790a41-kube-api-access-gxxg4\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.120910 4933 generic.go:334] "Generic (PLEG): container finished" podID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerID="115ff0ad4b0b271baef8b36cfbfbb04e13362ecf4986a5fa009c168aa3b0bc64" exitCode=0 Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.120979 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerDied","Data":"115ff0ad4b0b271baef8b36cfbfbb04e13362ecf4986a5fa009c168aa3b0bc64"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.121005 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerStarted","Data":"b509b031e61c155d89d15104c7bb3788900dc6b9ed585a9fa8c9fad628a27c2c"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.122654 4933 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.128150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c","Type":"ContainerStarted","Data":"49a382c18ad1468c6aa20d12e3bf6a8ebaac468aae2b59fc1aa0ff552c419bfd"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.133188 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qmg" event={"ID":"e4379206-f985-4847-90b4-604172bb7e6d","Type":"ContainerStarted","Data":"dddde4ccf5e780f758010ce59f192f8c0a4a015be674c8c6efb82e01d45fbf6f"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.140160 4933 generic.go:334] "Generic (PLEG): container finished" podID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerID="cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee" exitCode=0 Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.140213 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnlpk" event={"ID":"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42","Type":"ContainerDied","Data":"cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.140236 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnlpk" event={"ID":"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42","Type":"ContainerStarted","Data":"30122f2d14f9ca26839fb2fb7fa4bd11c23222863da7aa653dbd03451243670f"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.142122 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-catalog-content\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.142166 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqj4\" (UniqueName: \"kubernetes.io/projected/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-kube-api-access-4wqj4\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.142185 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-utilities\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.143419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kph29" event={"ID":"49685142-ff9c-4dec-9f6a-dbc8f06b7174","Type":"ContainerStarted","Data":"df2f971299ec7bc71cfcaac58fceaa5a2e4b14721b76f02163d8e8c852a5b4d7"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.151314 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" event={"ID":"9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab","Type":"ContainerStarted","Data":"a0e9affd0484b838dedcad7d041f50ce36afe2c5987871d967e1f544e8e857aa"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.154192 4933 
generic.go:334] "Generic (PLEG): container finished" podID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerID="23e6214856d265959fa8774ac6c12cd5bb4ab1b4ca6d1c36b2875a4eebd80812" exitCode=0 Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.154260 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mp97n" event={"ID":"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c","Type":"ContainerDied","Data":"23e6214856d265959fa8774ac6c12cd5bb4ab1b4ca6d1c36b2875a4eebd80812"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.158875 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lvtt7"] Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.160022 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.165693 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv2m4" event={"ID":"b3e7c8a1-0224-434e-bf6f-27342e1f27e6","Type":"ContainerStarted","Data":"a9a9d14b75345ba1f3324a744c896f88ebaeea28ad15b10b049f765f75d69597"} Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.166617 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:56 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:56 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:56 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.166658 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.176138 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grlhq" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.187516 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.187496502 podStartE2EDuration="3.187496502s" podCreationTimestamp="2025-12-02 15:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:56.184318542 +0000 UTC m=+159.435545275" watchObservedRunningTime="2025-12-02 15:54:56.187496502 +0000 UTC m=+159.438723205" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.243280 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-utilities\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.243871 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-utilities\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " 
pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.249299 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-catalog-content\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.250409 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqj4\" (UniqueName: \"kubernetes.io/projected/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-kube-api-access-4wqj4\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.250775 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-catalog-content\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.254569 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" podStartSLOduration=13.254555127 podStartE2EDuration="13.254555127s" podCreationTimestamp="2025-12-02 15:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:56.252594636 +0000 UTC m=+159.503821359" watchObservedRunningTime="2025-12-02 15:54:56.254555127 +0000 UTC m=+159.505781830" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.278094 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqj4\" (UniqueName: \"kubernetes.io/projected/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-kube-api-access-4wqj4\") pod \"redhat-operators-5sjn5\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.326684 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-utilities\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.327109 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-catalog-content\") pod \"redhat-operators-jb7rb\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.340313 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.483406 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.620209 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5sjn5"] Dec 02 15:54:56 crc kubenswrapper[4933]: W1202 15:54:56.638419 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5707d4_08a6_4d6a_9cb6_efbaf25caa15.slice/crio-6f68b72fcaa9a00f66a696294d4d71a89c21a5ad9fedad71cfbcea85f9540571 WatchSource:0}: Error finding container 6f68b72fcaa9a00f66a696294d4d71a89c21a5ad9fedad71cfbcea85f9540571: Status 404 returned error can't find the container with id 6f68b72fcaa9a00f66a696294d4d71a89c21a5ad9fedad71cfbcea85f9540571 Dec 02 15:54:56 crc kubenswrapper[4933]: I1202 15:54:56.721398 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jb7rb"] Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.069219 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.168047 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:57 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:57 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:57 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.168103 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.185932 4933 generic.go:334] "Generic (PLEG): container finished" podID="2806d3f7-26c5-43c1-b77e-9f42eefb7a0c" containerID="49a382c18ad1468c6aa20d12e3bf6a8ebaac468aae2b59fc1aa0ff552c419bfd" exitCode=0 Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.186010 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c","Type":"ContainerDied","Data":"49a382c18ad1468c6aa20d12e3bf6a8ebaac468aae2b59fc1aa0ff552c419bfd"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.190072 4933 generic.go:334] "Generic (PLEG): container finished" podID="e4379206-f985-4847-90b4-604172bb7e6d" containerID="12825a2199ba212f149748bb00f378440ab74d1c41d4203e02f0a2d64e3da9d8" exitCode=0 Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.190807 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qmg" event={"ID":"e4379206-f985-4847-90b4-604172bb7e6d","Type":"ContainerDied","Data":"12825a2199ba212f149748bb00f378440ab74d1c41d4203e02f0a2d64e3da9d8"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.195599 4933 generic.go:334] "Generic (PLEG): container finished" podID="43ed131f-a943-4e5e-a11f-e07507790a41" containerID="6e6dfc22a672262f956133e07d0a3a50e62784d26feb2d3518fd9a1c5c94f602" exitCode=0 Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.195668 4933 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-jb7rb" event={"ID":"43ed131f-a943-4e5e-a11f-e07507790a41","Type":"ContainerDied","Data":"6e6dfc22a672262f956133e07d0a3a50e62784d26feb2d3518fd9a1c5c94f602"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.195697 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb7rb" event={"ID":"43ed131f-a943-4e5e-a11f-e07507790a41","Type":"ContainerStarted","Data":"c90a00abbdeb98e162cb6f3ec6f14ade359977632fff201f542084c60cf22279"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.197786 4933 generic.go:334] "Generic (PLEG): container finished" podID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerID="086bb4b1a714b2015bba77f199edac2bc0213ed03000fc6e8ea6992c593a4436" exitCode=0 Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.198127 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kph29" event={"ID":"49685142-ff9c-4dec-9f6a-dbc8f06b7174","Type":"ContainerDied","Data":"086bb4b1a714b2015bba77f199edac2bc0213ed03000fc6e8ea6992c593a4436"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.203722 4933 generic.go:334] "Generic (PLEG): container finished" podID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerID="d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f" exitCode=0 Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.203944 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerDied","Data":"d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.203982 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerStarted","Data":"6f68b72fcaa9a00f66a696294d4d71a89c21a5ad9fedad71cfbcea85f9540571"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.241484 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" event={"ID":"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff","Type":"ContainerStarted","Data":"097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.241550 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" event={"ID":"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff","Type":"ContainerStarted","Data":"f262dc09d98ad7fb78b72e0a1d9d0b4a5370d69be2f413077e4e917951945ba8"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.242200 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.251146 4933 generic.go:334] "Generic (PLEG): container finished" podID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerID="e07988a312b895d475ffb56550309b00cc4b039db26fa8e64b4698fbcfa021ce" exitCode=0 Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.252780 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv2m4" event={"ID":"b3e7c8a1-0224-434e-bf6f-27342e1f27e6","Type":"ContainerDied","Data":"e07988a312b895d475ffb56550309b00cc4b039db26fa8e64b4698fbcfa021ce"} Dec 02 15:54:57 crc kubenswrapper[4933]: I1202 15:54:57.333976 4933 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" podStartSLOduration=135.333852789 podStartE2EDuration="2m15.333852789s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:54:57.293151952 +0000 UTC m=+160.544378655" watchObservedRunningTime="2025-12-02 15:54:57.333852789 +0000 UTC m=+160.585079512" Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.162605 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:58 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:58 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:58 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.163062 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.740313 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.810949 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kubelet-dir\") pod \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.811014 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kube-api-access\") pod \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\" (UID: \"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c\") " Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.811186 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2806d3f7-26c5-43c1-b77e-9f42eefb7a0c" (UID: "2806d3f7-26c5-43c1-b77e-9f42eefb7a0c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.811271 4933 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.821209 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2806d3f7-26c5-43c1-b77e-9f42eefb7a0c" (UID: "2806d3f7-26c5-43c1-b77e-9f42eefb7a0c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:54:58 crc kubenswrapper[4933]: I1202 15:54:58.912426 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2806d3f7-26c5-43c1-b77e-9f42eefb7a0c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.161817 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:54:59 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:54:59 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:54:59 crc kubenswrapper[4933]: healthz check failed Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.161893 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.287188 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2806d3f7-26c5-43c1-b77e-9f42eefb7a0c","Type":"ContainerDied","Data":"52192488db8d0d91e56d3c3538a78056bfad94b24e90d67fc322820bfabcb766"} Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.287250 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52192488db8d0d91e56d3c3538a78056bfad94b24e90d67fc322820bfabcb766" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.287325 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.976313 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 15:54:59 crc kubenswrapper[4933]: E1202 15:54:59.976616 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2806d3f7-26c5-43c1-b77e-9f42eefb7a0c" containerName="pruner" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.976635 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2806d3f7-26c5-43c1-b77e-9f42eefb7a0c" containerName="pruner" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.976778 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="2806d3f7-26c5-43c1-b77e-9f42eefb7a0c" containerName="pruner" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.977278 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.979673 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 15:54:59 crc kubenswrapper[4933]: I1202 15:54:59.983115 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.001792 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.144533 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.144822 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.164010 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:55:00 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:55:00 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:55:00 crc kubenswrapper[4933]: healthz check failed Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.164071 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.246950 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.247038 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.247770 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.303175 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.304745 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.312328 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.318290 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-w8l5x" Dec 02 15:55:00 crc kubenswrapper[4933]: I1202 15:55:00.886902 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 15:55:01 crc kubenswrapper[4933]: I1202 15:55:01.071562 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g4vq2" Dec 02 15:55:01 crc kubenswrapper[4933]: I1202 15:55:01.171691 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:55:01 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:55:01 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:55:01 crc kubenswrapper[4933]: healthz check failed Dec 02 15:55:01 crc kubenswrapper[4933]: I1202 15:55:01.172316 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:55:02 crc kubenswrapper[4933]: I1202 15:55:02.165622 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:55:02 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:55:02 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:55:02 crc kubenswrapper[4933]: healthz check failed Dec 02 15:55:02 crc kubenswrapper[4933]: I1202 15:55:02.166049 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:55:03 crc kubenswrapper[4933]: I1202 15:55:03.162039 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:55:03 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:55:03 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:55:03 crc kubenswrapper[4933]: healthz check failed Dec 02 15:55:03 crc kubenswrapper[4933]: I1202 15:55:03.162173 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" 
podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:55:04 crc kubenswrapper[4933]: I1202 15:55:04.161555 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:55:04 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:55:04 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:55:04 crc kubenswrapper[4933]: healthz check failed Dec 02 15:55:04 crc kubenswrapper[4933]: I1202 15:55:04.162065 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.133743 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.144617 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c95a4730-1427-4097-9ca3-4bd251e7acf0-metrics-certs\") pod \"network-metrics-daemon-qbps2\" (UID: \"c95a4730-1427-4097-9ca3-4bd251e7acf0\") " pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.163638 4933 patch_prober.go:28] interesting pod/router-default-5444994796-dgdm5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 15:55:05 crc kubenswrapper[4933]: [-]has-synced failed: reason withheld Dec 02 15:55:05 crc kubenswrapper[4933]: [+]process-running ok Dec 02 15:55:05 crc kubenswrapper[4933]: healthz check failed Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.163701 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dgdm5" podUID="043b25c4-e262-410b-aec8-e18fdb93d0c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.396593 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbps2" Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.701700 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9r8tr" Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.741446 4933 patch_prober.go:28] interesting pod/console-f9d7485db-lcwgg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 02 15:55:05 crc kubenswrapper[4933]: I1202 15:55:05.741509 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lcwgg" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 02 15:55:06 crc kubenswrapper[4933]: I1202 15:55:06.165642 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:55:06 crc kubenswrapper[4933]: I1202 15:55:06.168783 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dgdm5" Dec 02 15:55:10 crc kubenswrapper[4933]: I1202 15:55:10.466095 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b4c9a124-d2ee-48be-952c-1b3a742d6c00","Type":"ContainerStarted","Data":"d145a767db6aa3eba0a99bc61e6dbe81fe7196c0592af3815a937bbe8fc907d6"} Dec 02 15:55:15 crc kubenswrapper[4933]: I1202 15:55:15.658232 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:55:15 crc kubenswrapper[4933]: I1202 15:55:15.743758 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:55:15 crc kubenswrapper[4933]: I1202 15:55:15.751928 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:55:17 crc kubenswrapper[4933]: I1202 15:55:17.169312 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 15:55:17 crc kubenswrapper[4933]: I1202 15:55:17.170145 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 15:55:24 crc kubenswrapper[4933]: I1202 15:55:24.754946 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 15:55:25 crc kubenswrapper[4933]: I1202 15:55:25.661541 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4wd5d" Dec 02 15:55:27 crc kubenswrapper[4933]: E1202 15:55:27.789020 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 15:55:27 crc kubenswrapper[4933]: E1202 15:55:27.789574 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfbjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mp97n_openshift-marketplace(4f5f6b9a-09c1-46e9-9fba-1de95dd3244c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 15:55:27 crc kubenswrapper[4933]: E1202 15:55:27.791168 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mp97n" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" Dec 02 15:55:29 crc kubenswrapper[4933]: E1202 15:55:29.143512 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mp97n" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.168756 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.169962 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.180390 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.240346 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657b5a96-11b1-459d-8524-63f58a48e6c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.240448 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/657b5a96-11b1-459d-8524-63f58a48e6c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: E1202 15:55:33.251797 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 15:55:33 crc kubenswrapper[4933]: E1202 15:55:33.252061 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjfbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xnlpk_openshift-marketplace(dcfaf9e9-2939-430d-b0c8-7f8daaca0f42): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 15:55:33 crc kubenswrapper[4933]: E1202 15:55:33.253370 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled\"" pod="openshift-marketplace/community-operators-xnlpk" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.348582 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657b5a96-11b1-459d-8524-63f58a48e6c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.349275 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/657b5a96-11b1-459d-8524-63f58a48e6c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.349512 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/657b5a96-11b1-459d-8524-63f58a48e6c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.374404 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657b5a96-11b1-459d-8524-63f58a48e6c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:33 crc kubenswrapper[4933]: I1202 15:55:33.503249 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:35 crc kubenswrapper[4933]: E1202 15:55:35.791945 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xnlpk" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" Dec 02 15:55:35 crc kubenswrapper[4933]: E1202 15:55:35.893253 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 15:55:35 crc kubenswrapper[4933]: E1202 15:55:35.893544 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fj6gq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mv2m4_openshift-marketplace(b3e7c8a1-0224-434e-bf6f-27342e1f27e6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 15:55:35 crc kubenswrapper[4933]: E1202 15:55:35.894717 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mv2m4" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.565969 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.567132 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.579755 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.722399 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-var-lock\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.722451 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a53a119-fa38-4661-950b-e58963acf7ff-kube-api-access\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.722480 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.823480 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-var-lock\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.823532 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a53a119-fa38-4661-950b-e58963acf7ff-kube-api-access\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.823559 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.823634 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.823702 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-var-lock\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.846473 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a53a119-fa38-4661-950b-e58963acf7ff-kube-api-access\") pod \"installer-9-crc\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:38 crc kubenswrapper[4933]: I1202 15:55:38.885350 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 02 15:55:40 crc kubenswrapper[4933]: E1202 15:55:40.496486 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 02 15:55:40 crc kubenswrapper[4933]: E1202 15:55:40.496685 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p95k6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kph29_openshift-marketplace(49685142-ff9c-4dec-9f6a-dbc8f06b7174): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 15:55:40 crc kubenswrapper[4933]: E1202 15:55:40.499128 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kph29" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174"
Dec 02 15:55:45 crc kubenswrapper[4933]: E1202 15:55:45.803466 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 02 15:55:45 crc kubenswrapper[4933]: E1202 15:55:45.803983 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjrk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-54gm8_openshift-marketplace(c7cef2b5-6b23-47ba-83ce-161a02f128a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 15:55:45 crc kubenswrapper[4933]: E1202 15:55:45.805171 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-54gm8" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1"
Dec 02 15:55:46 crc kubenswrapper[4933]: E1202 15:55:46.762747 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-54gm8" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1"
Dec 02 15:55:46 crc kubenswrapper[4933]: E1202 15:55:46.762885 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kph29" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174"
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.170100 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.170774 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.170862 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.171694 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.171879 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01" gracePeriod=600
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.231960 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.234676 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qbps2"]
Dec 02 15:55:47 crc kubenswrapper[4933]: W1202 15:55:47.246209 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6a53a119_fa38_4661_950b_e58963acf7ff.slice/crio-0a13483c0257773cd7274a79c17b625a3a97b86a6b7b730bf34465abbde91793 WatchSource:0}: Error finding container 0a13483c0257773cd7274a79c17b625a3a97b86a6b7b730bf34465abbde91793: Status 404 returned error can't find the container with id 0a13483c0257773cd7274a79c17b625a3a97b86a6b7b730bf34465abbde91793
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.340291 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 02 15:55:47 crc kubenswrapper[4933]: W1202 15:55:47.351779 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod657b5a96_11b1_459d_8524_63f58a48e6c4.slice/crio-5118a18eb9589a38a60a06454701364ff75897cb9ddf6c5c33350d884683e600 WatchSource:0}: Error finding container 5118a18eb9589a38a60a06454701364ff75897cb9ddf6c5c33350d884683e600: Status 404 returned error can't find the container with id 5118a18eb9589a38a60a06454701364ff75897cb9ddf6c5c33350d884683e600
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.699458 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qbps2" event={"ID":"c95a4730-1427-4097-9ca3-4bd251e7acf0","Type":"ContainerStarted","Data":"1b5141ed20d603cd9200da3252037093f51b20c16a0b81d72c5e8da5002aedab"}
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.700575 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"657b5a96-11b1-459d-8524-63f58a48e6c4","Type":"ContainerStarted","Data":"5118a18eb9589a38a60a06454701364ff75897cb9ddf6c5c33350d884683e600"}
Dec 02 15:55:47 crc kubenswrapper[4933]: I1202 15:55:47.701938 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6a53a119-fa38-4661-950b-e58963acf7ff","Type":"ContainerStarted","Data":"0a13483c0257773cd7274a79c17b625a3a97b86a6b7b730bf34465abbde91793"}
Dec 02 15:55:48 crc kubenswrapper[4933]: E1202 15:55:48.883219 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.529870 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.530416 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wqj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5sjn5_openshift-marketplace(3f5707d4-08a6-4d6a-9cb6-efbaf25caa15): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.531606 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5sjn5" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.623129 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.623316 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxxg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jb7rb_openshift-marketplace(43ed131f-a943-4e5e-a11f-e07507790a41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.624868 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jb7rb" podUID="43ed131f-a943-4e5e-a11f-e07507790a41"
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.726512 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b4c9a124-d2ee-48be-952c-1b3a742d6c00","Type":"ContainerStarted","Data":"819d8df032ce016f9aec614dc4c2bb3d3b8671b0ac0b2137f90cab921356e967"}
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.730080 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"657b5a96-11b1-459d-8524-63f58a48e6c4","Type":"ContainerStarted","Data":"5ae82b5a8321b1357fbba9d8a382f534d836e924b675fd6ba6b9f2ba0fb89658"}
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.731794 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6a53a119-fa38-4661-950b-e58963acf7ff","Type":"ContainerStarted","Data":"2c641641ce88dcdcd38684d29f2c284954fba776681a96c108d9690bde8d6c63"}
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.734231 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01" exitCode=0
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.734287 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01"}
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.736628 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qbps2" event={"ID":"c95a4730-1427-4097-9ca3-4bd251e7acf0","Type":"ContainerStarted","Data":"102ae23b34bf6f0e733093ede325ef94e9895c15c716fec8f4d17470173ee392"}
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.739869 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jb7rb" podUID="43ed131f-a943-4e5e-a11f-e07507790a41"
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.740180 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=52.740138535 podStartE2EDuration="52.740138535s" podCreationTimestamp="2025-12-02 15:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:55:51.736784153 +0000 UTC m=+214.988010876" watchObservedRunningTime="2025-12-02 15:55:51.740138535 +0000 UTC m=+214.991365228"
Dec 02 15:55:51 crc kubenswrapper[4933]: E1202 15:55:51.740886 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5sjn5" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15"
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.771727 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=13.77170919 podStartE2EDuration="13.77170919s" podCreationTimestamp="2025-12-02 15:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:55:51.760595218 +0000 UTC m=+215.011821921" watchObservedRunningTime="2025-12-02 15:55:51.77170919 +0000 UTC m=+215.022935893"
Dec 02 15:55:51 crc kubenswrapper[4933]: I1202 15:55:51.773424 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=18.773418357 podStartE2EDuration="18.773418357s" podCreationTimestamp="2025-12-02 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:55:51.771378539 +0000 UTC m=+215.022605242" watchObservedRunningTime="2025-12-02 15:55:51.773418357 +0000 UTC m=+215.024645060"
Dec 02 15:55:52 crc kubenswrapper[4933]: E1202 15:55:52.011229 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 02 15:55:52 crc kubenswrapper[4933]: E1202 15:55:52.011431 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jwpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-w9qmg_openshift-marketplace(e4379206-f985-4847-90b4-604172bb7e6d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 15:55:52 crc kubenswrapper[4933]: E1202 15:55:52.012737 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-w9qmg" podUID="e4379206-f985-4847-90b4-604172bb7e6d"
Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.749570 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qbps2" event={"ID":"c95a4730-1427-4097-9ca3-4bd251e7acf0","Type":"ContainerStarted","Data":"ef7c5fa58676ca97f05c2e1335e7ac97a293e1f00932350b6a128e658441fed7"}
Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.754019 4933 generic.go:334] "Generic (PLEG): container finished" podID="b4c9a124-d2ee-48be-952c-1b3a742d6c00" containerID="819d8df032ce016f9aec614dc4c2bb3d3b8671b0ac0b2137f90cab921356e967" exitCode=0
Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.754253 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b4c9a124-d2ee-48be-952c-1b3a742d6c00","Type":"ContainerDied","Data":"819d8df032ce016f9aec614dc4c2bb3d3b8671b0ac0b2137f90cab921356e967"}
Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.756754 4933 generic.go:334] "Generic (PLEG): container finished" podID="657b5a96-11b1-459d-8524-63f58a48e6c4" containerID="5ae82b5a8321b1357fbba9d8a382f534d836e924b675fd6ba6b9f2ba0fb89658" exitCode=0
Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.756960 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"657b5a96-11b1-459d-8524-63f58a48e6c4","Type":"ContainerDied","Data":"5ae82b5a8321b1357fbba9d8a382f534d836e924b675fd6ba6b9f2ba0fb89658"}
Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.759764 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488"}
for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488"} Dec 02 15:55:52 crc kubenswrapper[4933]: E1202 15:55:52.766156 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-w9qmg" podUID="e4379206-f985-4847-90b4-604172bb7e6d" Dec 02 15:55:52 crc kubenswrapper[4933]: I1202 15:55:52.769728 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qbps2" podStartSLOduration=190.769712714 podStartE2EDuration="3m10.769712714s" podCreationTimestamp="2025-12-02 15:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:55:52.76657766 +0000 UTC m=+216.017804373" watchObservedRunningTime="2025-12-02 15:55:52.769712714 +0000 UTC m=+216.020939427" Dec 02 15:55:53 crc kubenswrapper[4933]: I1202 15:55:53.766506 4933 generic.go:334] "Generic (PLEG): container finished" podID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerID="264a46aff9cc71fa5de98c4b52a9f4c7f908dfec5c819be7a7a9cf31403928b5" exitCode=0 Dec 02 15:55:53 crc kubenswrapper[4933]: I1202 15:55:53.766611 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mp97n" event={"ID":"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c","Type":"ContainerDied","Data":"264a46aff9cc71fa5de98c4b52a9f4c7f908dfec5c819be7a7a9cf31403928b5"} Dec 02 15:55:53 crc kubenswrapper[4933]: I1202 15:55:53.770378 4933 generic.go:334] "Generic (PLEG): container finished" podID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerID="fe383de997ecef19e7276a057660d2cb3de1bfee4d13ba1b3316b3d03728169d" exitCode=0 Dec 02 15:55:53 crc kubenswrapper[4933]: I1202 15:55:53.770462 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv2m4" event={"ID":"b3e7c8a1-0224-434e-bf6f-27342e1f27e6","Type":"ContainerDied","Data":"fe383de997ecef19e7276a057660d2cb3de1bfee4d13ba1b3316b3d03728169d"} Dec 02 15:55:53 crc kubenswrapper[4933]: I1202 15:55:53.774628 4933 generic.go:334] "Generic (PLEG): container finished" podID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerID="0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94" exitCode=0 Dec 02 15:55:53 crc kubenswrapper[4933]: I1202 15:55:53.774812 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnlpk" event={"ID":"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42","Type":"ContainerDied","Data":"0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94"} Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.082963 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.126050 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kubelet-dir\") pod \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.126168 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kube-api-access\") pod \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\" (UID: \"b4c9a124-d2ee-48be-952c-1b3a742d6c00\") " Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.126200 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b4c9a124-d2ee-48be-952c-1b3a742d6c00" (UID: "b4c9a124-d2ee-48be-952c-1b3a742d6c00"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.126709 4933 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.135033 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b4c9a124-d2ee-48be-952c-1b3a742d6c00" (UID: "b4c9a124-d2ee-48be-952c-1b3a742d6c00"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.135996 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.227667 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657b5a96-11b1-459d-8524-63f58a48e6c4-kube-api-access\") pod \"657b5a96-11b1-459d-8524-63f58a48e6c4\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.227743 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/657b5a96-11b1-459d-8524-63f58a48e6c4-kubelet-dir\") pod \"657b5a96-11b1-459d-8524-63f58a48e6c4\" (UID: \"657b5a96-11b1-459d-8524-63f58a48e6c4\") " Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.227985 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9a124-d2ee-48be-952c-1b3a742d6c00-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.228048 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/657b5a96-11b1-459d-8524-63f58a48e6c4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "657b5a96-11b1-459d-8524-63f58a48e6c4" (UID: "657b5a96-11b1-459d-8524-63f58a48e6c4"). InnerVolumeSpecName "kubelet-dir". 
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.232020 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657b5a96-11b1-459d-8524-63f58a48e6c4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "657b5a96-11b1-459d-8524-63f58a48e6c4" (UID: "657b5a96-11b1-459d-8524-63f58a48e6c4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.329038 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657b5a96-11b1-459d-8524-63f58a48e6c4-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.329077 4933 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/657b5a96-11b1-459d-8524-63f58a48e6c4-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.787333 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.787315 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b4c9a124-d2ee-48be-952c-1b3a742d6c00","Type":"ContainerDied","Data":"d145a767db6aa3eba0a99bc61e6dbe81fe7196c0592af3815a937bbe8fc907d6"}
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.787782 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d145a767db6aa3eba0a99bc61e6dbe81fe7196c0592af3815a937bbe8fc907d6"
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.789089 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"657b5a96-11b1-459d-8524-63f58a48e6c4","Type":"ContainerDied","Data":"5118a18eb9589a38a60a06454701364ff75897cb9ddf6c5c33350d884683e600"}
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.789128 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 02 15:55:54 crc kubenswrapper[4933]: I1202 15:55:54.789143 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5118a18eb9589a38a60a06454701364ff75897cb9ddf6c5c33350d884683e600"
Dec 02 15:55:55 crc kubenswrapper[4933]: I1202 15:55:55.800627 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnlpk" event={"ID":"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42","Type":"ContainerStarted","Data":"4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6"}
Dec 02 15:55:55 crc kubenswrapper[4933]: I1202 15:55:55.805778 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mp97n" event={"ID":"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c","Type":"ContainerStarted","Data":"c5a80b8d57297ab1bbbb1fc495c0187e3370d09045324fb3adb2daa6b5507834"}
Dec 02 15:55:55 crc kubenswrapper[4933]: I1202 15:55:55.809269 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv2m4" event={"ID":"b3e7c8a1-0224-434e-bf6f-27342e1f27e6","Type":"ContainerStarted","Data":"3074e86c3f9d3362141c1623062e45d8c282d41b64e81cda76434a9928ac376c"}
Dec 02 15:55:55 crc kubenswrapper[4933]: I1202 15:55:55.833353 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xnlpk" podStartSLOduration=3.5608973390000003 podStartE2EDuration="1m2.833336295s" podCreationTimestamp="2025-12-02 15:54:53 +0000 UTC" firstStartedPulling="2025-12-02 15:54:56.142070226 +0000 UTC m=+159.393296929" lastFinishedPulling="2025-12-02 15:55:55.414509182 +0000 UTC m=+218.665735885" observedRunningTime="2025-12-02 15:55:55.823262839 +0000 UTC m=+219.074489562" watchObservedRunningTime="2025-12-02 15:55:55.833336295 +0000 UTC m=+219.084562998"
Dec 02 15:55:55 crc kubenswrapper[4933]: I1202 15:55:55.853591 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mp97n" podStartSLOduration=4.610167698 podStartE2EDuration="1m3.853575151s" podCreationTimestamp="2025-12-02 15:54:52 +0000 UTC" firstStartedPulling="2025-12-02 15:54:56.160789474 +0000 UTC m=+159.412016177" lastFinishedPulling="2025-12-02 15:55:55.404196917 +0000 UTC m=+218.655423630" observedRunningTime="2025-12-02 15:55:55.850390555 +0000 UTC m=+219.101617278" watchObservedRunningTime="2025-12-02 15:55:55.853575151 +0000 UTC m=+219.104801844"
Dec 02 15:55:55 crc kubenswrapper[4933]: I1202 15:55:55.877967 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mv2m4" podStartSLOduration=3.190082681 podStartE2EDuration="1m0.877951146s" podCreationTimestamp="2025-12-02 15:54:55 +0000 UTC" firstStartedPulling="2025-12-02 15:54:57.257498352 +0000 UTC m=+160.508725055" lastFinishedPulling="2025-12-02 15:55:54.945366817 +0000 UTC m=+218.196593520" observedRunningTime="2025-12-02 15:55:55.876531908 +0000 UTC m=+219.127758641" watchObservedRunningTime="2025-12-02 15:55:55.877951146 +0000 UTC m=+219.129177849"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.306103 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.307104 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.320114 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mp97n"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.320234 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mp97n"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.404959 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.405378 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mp97n"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.869851 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerStarted","Data":"8361cc21fe6cf67feae0c8dccefb304d37c678dc42099934e664c3eb8a8446fc"}
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.923056 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:56:04 crc kubenswrapper[4933]: I1202 15:56:04.946003 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mp97n"
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.401732 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnlpk"]
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.503399 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.503467 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.541788 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.877226 4933 generic.go:334] "Generic (PLEG): container finished" podID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerID="8361cc21fe6cf67feae0c8dccefb304d37c678dc42099934e664c3eb8a8446fc" exitCode=0
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.877420 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerDied","Data":"8361cc21fe6cf67feae0c8dccefb304d37c678dc42099934e664c3eb8a8446fc"}
Dec 02 15:56:05 crc kubenswrapper[4933]: I1202 15:56:05.916498 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mv2m4"
Dec 02 15:56:06 crc kubenswrapper[4933]: I1202 15:56:06.883774 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerStarted","Data":"0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55"}
Dec 02 15:56:06 crc kubenswrapper[4933]: I1202 15:56:06.889565 4933 generic.go:334] "Generic (PLEG): container finished" podID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerID="2361a7f4f6a12289c9a426bed576ea742831908f5e2fc57a73c189f03eaa4c5d" exitCode=0
Dec 02 15:56:06 crc kubenswrapper[4933]: I1202 15:56:06.889658 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kph29" event={"ID":"49685142-ff9c-4dec-9f6a-dbc8f06b7174","Type":"ContainerDied","Data":"2361a7f4f6a12289c9a426bed576ea742831908f5e2fc57a73c189f03eaa4c5d"}
Dec 02 15:56:06 crc kubenswrapper[4933]: I1202 15:56:06.890412 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xnlpk" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="registry-server" containerID="cri-o://4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6" gracePeriod=2
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.260734 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.409614 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjfbf\" (UniqueName: \"kubernetes.io/projected/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-kube-api-access-pjfbf\") pod \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") "
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.409668 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-catalog-content\") pod \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") "
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.410174 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-utilities\") pod \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\" (UID: \"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42\") "
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.411227 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-utilities" (OuterVolumeSpecName: "utilities") pod "dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" (UID: "dcfaf9e9-2939-430d-b0c8-7f8daaca0f42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.420724 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-kube-api-access-pjfbf" (OuterVolumeSpecName: "kube-api-access-pjfbf") pod "dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" (UID: "dcfaf9e9-2939-430d-b0c8-7f8daaca0f42"). InnerVolumeSpecName "kube-api-access-pjfbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.464353 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" (UID: "dcfaf9e9-2939-430d-b0c8-7f8daaca0f42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.511465 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjfbf\" (UniqueName: \"kubernetes.io/projected/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-kube-api-access-pjfbf\") on node \"crc\" DevicePath \"\""
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.511492 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.511575 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.897950 4933 generic.go:334] "Generic (PLEG): container finished" podID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerID="4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6" exitCode=0
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.898019 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnlpk"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.898007 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnlpk" event={"ID":"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42","Type":"ContainerDied","Data":"4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6"}
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.898659 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnlpk" event={"ID":"dcfaf9e9-2939-430d-b0c8-7f8daaca0f42","Type":"ContainerDied","Data":"30122f2d14f9ca26839fb2fb7fa4bd11c23222863da7aa653dbd03451243670f"}
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.898799 4933 scope.go:117] "RemoveContainer" containerID="4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.910617 4933 generic.go:334] "Generic (PLEG): container finished" podID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerID="0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55" exitCode=0
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.910669 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerDied","Data":"0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55"}
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.925703 4933 scope.go:117] "RemoveContainer" containerID="0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.945770 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnlpk"]
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.949337 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xnlpk"]
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.963804 4933 scope.go:117] "RemoveContainer" containerID="cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.983264 4933 scope.go:117] "RemoveContainer" containerID="4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6"
Dec 02 15:56:07 crc kubenswrapper[4933]: E1202 15:56:07.983705 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6\": container with ID starting with 4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6 not found: ID does not exist" containerID="4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.983749 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6"} err="failed to get container status \"4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6\": rpc error: code = NotFound desc = could not find container \"4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6\": container with ID starting with 4c2eeb6549f3e0814ec58ea3ba6207164d996e2c08c5796735ca479613038bc6 not found: ID does not exist"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.983778 4933 scope.go:117] "RemoveContainer" containerID="0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94"
Dec 02 15:56:07 crc kubenswrapper[4933]: E1202 15:56:07.984452 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94\": container with ID starting with 0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94 not found: ID does not exist" containerID="0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.984603 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94"} err="failed to get container status \"0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94\": rpc error: code = NotFound desc = could not find container \"0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94\": container with ID starting with 0ac96bad7630b3e5145f1a296df458a3dec889d9045df9248fa03debb9ee0c94 not found: ID does not exist"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.984724 4933 scope.go:117] "RemoveContainer" containerID="cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee"
Dec 02 15:56:07 crc kubenswrapper[4933]: E1202 15:56:07.985288 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee\": container with ID starting with cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee not found: ID does not exist" containerID="cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee"
Dec 02 15:56:07 crc kubenswrapper[4933]: I1202 15:56:07.985330 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee"} err="failed to get container status \"cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee\": rpc error: code = NotFound desc = could not find container \"cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee\": container with ID starting with cbb31cc2f12f5d397e8f8a75777bbc3b19b57a47677de32e2de4511cc81bc1ee not found: ID does not exist"
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.918696 4933 generic.go:334] "Generic (PLEG): container finished" podID="e4379206-f985-4847-90b4-604172bb7e6d" containerID="180a0cd7f8de32a5f3081d75f59b252b6884d65379ef8f4284ed5c51c23f095c" exitCode=0
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.918753 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qmg" event={"ID":"e4379206-f985-4847-90b4-604172bb7e6d","Type":"ContainerDied","Data":"180a0cd7f8de32a5f3081d75f59b252b6884d65379ef8f4284ed5c51c23f095c"}
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.925700 4933 generic.go:334] "Generic (PLEG): container finished" podID="43ed131f-a943-4e5e-a11f-e07507790a41" containerID="fb3797e3a924c501703f252c1bf7b1f9267e19460cd3f72936f4c849bc6f9155" exitCode=0
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.925775 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb7rb" event={"ID":"43ed131f-a943-4e5e-a11f-e07507790a41","Type":"ContainerDied","Data":"fb3797e3a924c501703f252c1bf7b1f9267e19460cd3f72936f4c849bc6f9155"}
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.928039 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kph29" event={"ID":"49685142-ff9c-4dec-9f6a-dbc8f06b7174","Type":"ContainerStarted","Data":"61601e1478cb15a05c3be8da00f7e57daecc9cfefed32b705475af46854eb46f"}
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.932335 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerStarted","Data":"b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf"}
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.934678 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerStarted","Data":"b7908cb3ce05f99c0ff3b8500eb3564591f14c669382713dd6aca2221b612082"}
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.966944 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5sjn5" podStartSLOduration=1.5815146329999998 podStartE2EDuration="1m12.966748506s" podCreationTimestamp="2025-12-02 15:54:56 +0000 UTC" firstStartedPulling="2025-12-02 15:54:57.235104799 +0000 UTC m=+160.486331502" lastFinishedPulling="2025-12-02 15:56:08.620338672 +0000 UTC m=+231.871565375" observedRunningTime="2025-12-02 15:56:08.959516465 +0000 UTC m=+232.210743188" watchObservedRunningTime="2025-12-02 15:56:08.966748506 +0000 UTC m=+232.217975229"
Dec 02 15:56:08 crc kubenswrapper[4933]: I1202 15:56:08.986134 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kph29" podStartSLOduration=5.036239682 podStartE2EDuration="1m15.986099833s" podCreationTimestamp="2025-12-02 15:54:53 +0000 UTC" firstStartedPulling="2025-12-02 15:54:57.202336211 +0000 UTC m=+160.453562914" lastFinishedPulling="2025-12-02 15:56:08.152196362 +0000 UTC m=+231.403423065" observedRunningTime="2025-12-02 15:56:08.984407896 +0000 UTC m=+232.235634599" watchObservedRunningTime="2025-12-02 15:56:08.986099833 +0000 UTC m=+232.237326536"
Dec 02 15:56:09 crc kubenswrapper[4933]: I1202 15:56:09.044721 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54gm8" podStartSLOduration=5.456847643 podStartE2EDuration="1m17.04469139s" podCreationTimestamp="2025-12-02 15:54:52 +0000 UTC" firstStartedPulling="2025-12-02 15:54:56.122417319 +0000 UTC m=+159.373644022" lastFinishedPulling="2025-12-02 15:56:07.710261066 +0000 UTC m=+230.961487769" observedRunningTime="2025-12-02 15:56:09.044453542 +0000 UTC m=+232.295680245" watchObservedRunningTime="2025-12-02 15:56:09.04469139 +0000 UTC m=+232.295918093"
Dec 02 15:56:09 crc kubenswrapper[4933]: I1202 15:56:09.071305 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" path="/var/lib/kubelet/pods/dcfaf9e9-2939-430d-b0c8-7f8daaca0f42/volumes"
Dec 02 15:56:10 crc kubenswrapper[4933]: I1202 15:56:10.981064 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qmg" event={"ID":"e4379206-f985-4847-90b4-604172bb7e6d","Type":"ContainerStarted","Data":"3a1c66091de41b276ec642b816f79d7552a7b014c3396ce16756e260b52bf561"}
Dec 02 15:56:10 crc kubenswrapper[4933]: I1202 15:56:10.985688 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb7rb" event={"ID":"43ed131f-a943-4e5e-a11f-e07507790a41","Type":"ContainerStarted","Data":"e7b4ddfc385effe8ebae763018a4d0dc4c55cdb2374f6f12aa9709a955f7fe4a"}
Dec 02 15:56:11 crc kubenswrapper[4933]: I1202 15:56:11.004052 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w9qmg" podStartSLOduration=3.352235692 podStartE2EDuration="1m16.004025514s" podCreationTimestamp="2025-12-02 15:54:55 +0000 UTC" firstStartedPulling="2025-12-02 15:54:57.191432418 +0000 UTC m=+160.442659121" lastFinishedPulling="2025-12-02 15:56:09.84322224 +0000 UTC m=+233.094448943" observedRunningTime="2025-12-02 15:56:11.00210416 +0000 UTC m=+234.253330883" watchObservedRunningTime="2025-12-02 15:56:11.004025514 +0000 UTC m=+234.255252217"
Dec 02 15:56:11 crc kubenswrapper[4933]: I1202 15:56:11.021602 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jb7rb" podStartSLOduration=3.416110983 podStartE2EDuration="1m16.021583201s" podCreationTimestamp="2025-12-02 15:54:55 +0000 UTC" firstStartedPulling="2025-12-02 15:54:57.19817864 +0000 UTC m=+160.449405333" lastFinishedPulling="2025-12-02 15:56:09.803650848 +0000 UTC m=+233.054877551" observedRunningTime="2025-12-02 15:56:11.020029299 +0000 UTC m=+234.271256012" watchObservedRunningTime="2025-12-02 15:56:11.021583201 +0000 UTC m=+234.272809914"
Dec 02 15:56:14 crc kubenswrapper[4933]: I1202 15:56:14.081476 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54gm8"
Dec 02 15:56:14 crc kubenswrapper[4933]: I1202 15:56:14.081887 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54gm8"
Dec 02 15:56:14 crc kubenswrapper[4933]: I1202 15:56:14.154457 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54gm8"
Dec 02 15:56:14 crc kubenswrapper[4933]: I1202 15:56:14.608146 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:56:14 crc kubenswrapper[4933]: I1202 15:56:14.608966 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:56:14 crc kubenswrapper[4933]: I1202 15:56:14.658051 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:56:15 crc kubenswrapper[4933]: I1202 15:56:15.061501 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54gm8"
Dec 02 15:56:15 crc kubenswrapper[4933]: I1202 15:56:15.063376 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kph29"
Dec 02 15:56:15 crc kubenswrapper[4933]: I1202 15:56:15.741578 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:56:15 crc kubenswrapper[4933]: I1202 15:56:15.741649 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:56:15 crc kubenswrapper[4933]: I1202 15:56:15.783716 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.060316 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w9qmg"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.341063 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5sjn5"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.341817 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5sjn5"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.388263 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5sjn5"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.483511 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.484114 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.537470 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:56:16 crc kubenswrapper[4933]: I1202 15:56:16.801360 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qmg"]
Dec 02 15:56:17 crc kubenswrapper[4933]: I1202 15:56:17.067019 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5sjn5"
Dec 02 15:56:17 crc kubenswrapper[4933]: I1202 15:56:17.074663 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jb7rb"
Dec 02 15:56:17 crc kubenswrapper[4933]: I1202 15:56:17.806035 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kph29"]
Dec 02 15:56:18 crc kubenswrapper[4933]: I1202 15:56:18.033007 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w9qmg" podUID="e4379206-f985-4847-90b4-604172bb7e6d"
containerName="registry-server" containerID="cri-o://3a1c66091de41b276ec642b816f79d7552a7b014c3396ce16756e260b52bf561" gracePeriod=2 Dec 02 15:56:18 crc kubenswrapper[4933]: I1202 15:56:18.033113 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kph29" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="registry-server" containerID="cri-o://61601e1478cb15a05c3be8da00f7e57daecc9cfefed32b705475af46854eb46f" gracePeriod=2 Dec 02 15:56:19 crc kubenswrapper[4933]: I1202 15:56:19.204251 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5sjn5"] Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012125 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-shlpj"] Dec 02 15:56:20 crc kubenswrapper[4933]: E1202 15:56:20.012565 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657b5a96-11b1-459d-8524-63f58a48e6c4" containerName="pruner" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012595 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="657b5a96-11b1-459d-8524-63f58a48e6c4" containerName="pruner" Dec 02 15:56:20 crc kubenswrapper[4933]: E1202 15:56:20.012614 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c9a124-d2ee-48be-952c-1b3a742d6c00" containerName="pruner" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012624 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c9a124-d2ee-48be-952c-1b3a742d6c00" containerName="pruner" Dec 02 15:56:20 crc kubenswrapper[4933]: E1202 15:56:20.012635 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="extract-content" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012645 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="extract-content" Dec 02 15:56:20 crc kubenswrapper[4933]: E1202 15:56:20.012654 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="extract-utilities" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012661 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="extract-utilities" Dec 02 15:56:20 crc kubenswrapper[4933]: E1202 15:56:20.012683 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="registry-server" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012689 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="registry-server" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012814 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="657b5a96-11b1-459d-8524-63f58a48e6c4" containerName="pruner" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012915 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c9a124-d2ee-48be-952c-1b3a742d6c00" containerName="pruner" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.012931 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcfaf9e9-2939-430d-b0c8-7f8daaca0f42" containerName="registry-server" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.013495 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.025138 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-shlpj"] Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.046467 4933 generic.go:334] "Generic (PLEG): container finished" podID="e4379206-f985-4847-90b4-604172bb7e6d" containerID="3a1c66091de41b276ec642b816f79d7552a7b014c3396ce16756e260b52bf561" exitCode=0 Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.046543 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qmg" event={"ID":"e4379206-f985-4847-90b4-604172bb7e6d","Type":"ContainerDied","Data":"3a1c66091de41b276ec642b816f79d7552a7b014c3396ce16756e260b52bf561"} Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.048727 4933 generic.go:334] "Generic (PLEG): container finished" podID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerID="61601e1478cb15a05c3be8da00f7e57daecc9cfefed32b705475af46854eb46f" exitCode=0 Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.048836 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kph29" event={"ID":"49685142-ff9c-4dec-9f6a-dbc8f06b7174","Type":"ContainerDied","Data":"61601e1478cb15a05c3be8da00f7e57daecc9cfefed32b705475af46854eb46f"} Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.049021 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5sjn5" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="registry-server" containerID="cri-o://b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf" gracePeriod=2 Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.086752 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df4c64f4-2e92-4262-9f0d-070daf74622e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.086860 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-bound-sa-token\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.086889 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd5qd\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-kube-api-access-jd5qd\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.086905 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-registry-tls\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 
crc kubenswrapper[4933]: I1202 15:56:20.086925 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df4c64f4-2e92-4262-9f0d-070daf74622e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.086951 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.086986 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df4c64f4-2e92-4262-9f0d-070daf74622e-registry-certificates\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.087019 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df4c64f4-2e92-4262-9f0d-070daf74622e-trusted-ca\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.106809 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.187739 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-bound-sa-token\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.187787 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd5qd\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-kube-api-access-jd5qd\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.187809 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-registry-tls\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.187942 4933 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df4c64f4-2e92-4262-9f0d-070daf74622e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.187997 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df4c64f4-2e92-4262-9f0d-070daf74622e-registry-certificates\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.188027 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df4c64f4-2e92-4262-9f0d-070daf74622e-trusted-ca\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.188066 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df4c64f4-2e92-4262-9f0d-070daf74622e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.188528 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df4c64f4-2e92-4262-9f0d-070daf74622e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.189681 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df4c64f4-2e92-4262-9f0d-070daf74622e-registry-certificates\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.190034 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df4c64f4-2e92-4262-9f0d-070daf74622e-trusted-ca\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.192709 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-registry-tls\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.194363 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df4c64f4-2e92-4262-9f0d-070daf74622e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.203935 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd5qd\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-kube-api-access-jd5qd\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.211488 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df4c64f4-2e92-4262-9f0d-070daf74622e-bound-sa-token\") pod \"image-registry-66df7c8f76-shlpj\" (UID: \"df4c64f4-2e92-4262-9f0d-070daf74622e\") " pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.336926 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:20 crc kubenswrapper[4933]: I1202 15:56:20.728119 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-shlpj"] Dec 02 15:56:20 crc kubenswrapper[4933]: W1202 15:56:20.735275 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4c64f4_2e92_4262_9f0d_070daf74622e.slice/crio-c1e9e36252c5c1c98671d8f74b3c4ddb060aa903fe5e8ff3f7949950d63d8e5e WatchSource:0}: Error finding container c1e9e36252c5c1c98671d8f74b3c4ddb060aa903fe5e8ff3f7949950d63d8e5e: Status 404 returned error can't find the container with id c1e9e36252c5c1c98671d8f74b3c4ddb060aa903fe5e8ff3f7949950d63d8e5e Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.037328 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kph29" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.066413 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kph29" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.067860 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kph29" event={"ID":"49685142-ff9c-4dec-9f6a-dbc8f06b7174","Type":"ContainerDied","Data":"df2f971299ec7bc71cfcaac58fceaa5a2e4b14721b76f02163d8e8c852a5b4d7"} Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.067902 4933 scope.go:117] "RemoveContainer" containerID="61601e1478cb15a05c3be8da00f7e57daecc9cfefed32b705475af46854eb46f" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.068173 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" event={"ID":"df4c64f4-2e92-4262-9f0d-070daf74622e","Type":"ContainerStarted","Data":"c1e9e36252c5c1c98671d8f74b3c4ddb060aa903fe5e8ff3f7949950d63d8e5e"} Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.085966 4933 scope.go:117] "RemoveContainer" containerID="2361a7f4f6a12289c9a426bed576ea742831908f5e2fc57a73c189f03eaa4c5d" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.106143 4933 scope.go:117] "RemoveContainer" containerID="086bb4b1a714b2015bba77f199edac2bc0213ed03000fc6e8ea6992c593a4436" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.108898 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-utilities\") pod \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.108941 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-catalog-content\") pod \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.108981 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p95k6\" (UniqueName: \"kubernetes.io/projected/49685142-ff9c-4dec-9f6a-dbc8f06b7174-kube-api-access-p95k6\") pod \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\" (UID: \"49685142-ff9c-4dec-9f6a-dbc8f06b7174\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.109672 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-utilities" (OuterVolumeSpecName: "utilities") pod "49685142-ff9c-4dec-9f6a-dbc8f06b7174" (UID: "49685142-ff9c-4dec-9f6a-dbc8f06b7174"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.115381 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49685142-ff9c-4dec-9f6a-dbc8f06b7174-kube-api-access-p95k6" (OuterVolumeSpecName: "kube-api-access-p95k6") pod "49685142-ff9c-4dec-9f6a-dbc8f06b7174" (UID: "49685142-ff9c-4dec-9f6a-dbc8f06b7174"). InnerVolumeSpecName "kube-api-access-p95k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.159204 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49685142-ff9c-4dec-9f6a-dbc8f06b7174" (UID: "49685142-ff9c-4dec-9f6a-dbc8f06b7174"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.210493 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.210534 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49685142-ff9c-4dec-9f6a-dbc8f06b7174-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.210550 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p95k6\" (UniqueName: \"kubernetes.io/projected/49685142-ff9c-4dec-9f6a-dbc8f06b7174-kube-api-access-p95k6\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.416865 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qmg" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.423506 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kph29"] Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.427555 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kph29"] Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.514235 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-utilities\") pod \"e4379206-f985-4847-90b4-604172bb7e6d\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.514322 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jwpm\" (UniqueName: \"kubernetes.io/projected/e4379206-f985-4847-90b4-604172bb7e6d-kube-api-access-6jwpm\") pod \"e4379206-f985-4847-90b4-604172bb7e6d\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.514397 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-catalog-content\") pod \"e4379206-f985-4847-90b4-604172bb7e6d\" (UID: \"e4379206-f985-4847-90b4-604172bb7e6d\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.515189 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-utilities" (OuterVolumeSpecName: "utilities") pod "e4379206-f985-4847-90b4-604172bb7e6d" (UID: "e4379206-f985-4847-90b4-604172bb7e6d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.518407 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4379206-f985-4847-90b4-604172bb7e6d-kube-api-access-6jwpm" (OuterVolumeSpecName: "kube-api-access-6jwpm") pod "e4379206-f985-4847-90b4-604172bb7e6d" (UID: "e4379206-f985-4847-90b4-604172bb7e6d"). InnerVolumeSpecName "kube-api-access-6jwpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.534275 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4379206-f985-4847-90b4-604172bb7e6d" (UID: "e4379206-f985-4847-90b4-604172bb7e6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.566797 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.615179 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-catalog-content\") pod \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.615309 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-utilities\") pod \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.615388 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wqj4\" (UniqueName: \"kubernetes.io/projected/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-kube-api-access-4wqj4\") pod \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\" (UID: \"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15\") " Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.615746 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.615766 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jwpm\" (UniqueName: \"kubernetes.io/projected/e4379206-f985-4847-90b4-604172bb7e6d-kube-api-access-6jwpm\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.615779 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4379206-f985-4847-90b4-604172bb7e6d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.616386 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-utilities" (OuterVolumeSpecName: "utilities") pod "3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" (UID: "3f5707d4-08a6-4d6a-9cb6-efbaf25caa15"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.618450 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-kube-api-access-4wqj4" (OuterVolumeSpecName: "kube-api-access-4wqj4") pod "3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" (UID: "3f5707d4-08a6-4d6a-9cb6-efbaf25caa15"). InnerVolumeSpecName "kube-api-access-4wqj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.716717 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.716755 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wqj4\" (UniqueName: \"kubernetes.io/projected/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-kube-api-access-4wqj4\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.720645 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" (UID: "3f5707d4-08a6-4d6a-9cb6-efbaf25caa15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:21 crc kubenswrapper[4933]: I1202 15:56:21.818911 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.073918 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" event={"ID":"df4c64f4-2e92-4262-9f0d-070daf74622e","Type":"ContainerStarted","Data":"a71060cde73808e603d12bf0dd484cf0b474368fb0ff4ef105fea02e4bd2ece4"} Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.073992 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.075857 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qmg" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.075872 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qmg" event={"ID":"e4379206-f985-4847-90b4-604172bb7e6d","Type":"ContainerDied","Data":"dddde4ccf5e780f758010ce59f192f8c0a4a015be674c8c6efb82e01d45fbf6f"} Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.075954 4933 scope.go:117] "RemoveContainer" containerID="3a1c66091de41b276ec642b816f79d7552a7b014c3396ce16756e260b52bf561" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.079493 4933 generic.go:334] "Generic (PLEG): container finished" podID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerID="b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf" exitCode=0 Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.079539 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerDied","Data":"b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf"} Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.079554 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sjn5" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.079568 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sjn5" event={"ID":"3f5707d4-08a6-4d6a-9cb6-efbaf25caa15","Type":"ContainerDied","Data":"6f68b72fcaa9a00f66a696294d4d71a89c21a5ad9fedad71cfbcea85f9540571"} Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.096072 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" podStartSLOduration=3.096048811 podStartE2EDuration="3.096048811s" podCreationTimestamp="2025-12-02 15:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:56:22.093383752 +0000 UTC m=+245.344610455" watchObservedRunningTime="2025-12-02 15:56:22.096048811 +0000 UTC m=+245.347275514" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.099864 4933 scope.go:117] "RemoveContainer" containerID="180a0cd7f8de32a5f3081d75f59b252b6884d65379ef8f4284ed5c51c23f095c" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.111866 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qmg"] Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.117533 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qmg"] Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.131995 4933 scope.go:117] "RemoveContainer" containerID="12825a2199ba212f149748bb00f378440ab74d1c41d4203e02f0a2d64e3da9d8" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.140739 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5sjn5"] Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.144929 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5sjn5"] Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.152440 4933 scope.go:117] "RemoveContainer" containerID="b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.164908 4933 scope.go:117] 
"RemoveContainer" containerID="0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.179361 4933 scope.go:117] "RemoveContainer" containerID="d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.194645 4933 scope.go:117] "RemoveContainer" containerID="b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf" Dec 02 15:56:22 crc kubenswrapper[4933]: E1202 15:56:22.195140 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf\": container with ID starting with b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf not found: ID does not exist" containerID="b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.195176 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf"} err="failed to get container status \"b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf\": rpc error: code = NotFound desc = could not find container \"b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf\": container with ID starting with b2ddf2182159ac94110db91f7643ca8775d115da88525bd699e0a8aeb98491bf not found: ID does not exist" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.195202 4933 scope.go:117] "RemoveContainer" containerID="0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55" Dec 02 15:56:22 crc kubenswrapper[4933]: E1202 15:56:22.195450 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55\": container with ID starting with 0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55 not found: ID does not exist" containerID="0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.195478 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55"} err="failed to get container status \"0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55\": rpc error: code = NotFound desc = could not find container \"0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55\": container with ID starting with 0a5f0f2b56c36056815297f5e95c02d751bb9a6ea76462f4d491a9cd2b14cc55 not found: ID does not exist" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.195499 4933 scope.go:117] "RemoveContainer" containerID="d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f" Dec 02 15:56:22 crc kubenswrapper[4933]: E1202 15:56:22.196185 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f\": container with ID starting with d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f not found: ID does not exist" containerID="d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f" Dec 02 15:56:22 crc kubenswrapper[4933]: I1202 15:56:22.196228 4933 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f"} err="failed to get container status \"d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f\": rpc error: code = NotFound desc = could not find container \"d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f\": container with ID starting with d3b6d1ad2f4d9493b53732a4b71dfdfb3dc5469003a70aa18c906815c304ba1f not found: ID does not exist" Dec 02 15:56:23 crc kubenswrapper[4933]: I1202 15:56:23.061581 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" path="/var/lib/kubelet/pods/3f5707d4-08a6-4d6a-9cb6-efbaf25caa15/volumes" Dec 02 15:56:23 crc kubenswrapper[4933]: I1202 15:56:23.062493 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" path="/var/lib/kubelet/pods/49685142-ff9c-4dec-9f6a-dbc8f06b7174/volumes" Dec 02 15:56:23 crc kubenswrapper[4933]: I1202 15:56:23.063146 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4379206-f985-4847-90b4-604172bb7e6d" path="/var/lib/kubelet/pods/e4379206-f985-4847-90b4-604172bb7e6d/volumes" Dec 02 15:56:25 crc kubenswrapper[4933]: I1202 15:56:25.078817 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-26gvw"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.744593 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mp97n"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.745389 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mp97n" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="registry-server" containerID="cri-o://c5a80b8d57297ab1bbbb1fc495c0187e3370d09045324fb3adb2daa6b5507834" gracePeriod=30 Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.752285 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54gm8"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.752569 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54gm8" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="registry-server" containerID="cri-o://b7908cb3ce05f99c0ff3b8500eb3564591f14c669382713dd6aca2221b612082" gracePeriod=30 Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.759503 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r7dnz"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.759697 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerName="marketplace-operator" containerID="cri-o://c583d93a0b8b419befad740071a84b41fe53c2fb71edd40029821c1edf52785c" gracePeriod=30 Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.776482 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv2m4"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.777128 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mv2m4" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="registry-server" 
containerID="cri-o://3074e86c3f9d3362141c1623062e45d8c282d41b64e81cda76434a9928ac376c" gracePeriod=30 Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.788066 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jb7rb"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.789005 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jb7rb" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="registry-server" containerID="cri-o://e7b4ddfc385effe8ebae763018a4d0dc4c55cdb2374f6f12aa9709a955f7fe4a" gracePeriod=30 Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794183 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9r4r"] Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794422 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="extract-utilities" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794439 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="extract-utilities" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794450 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794457 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794469 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="extract-utilities" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794475 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="extract-utilities" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794484 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="extract-content" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794489 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="extract-content" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794498 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="extract-utilities" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794504 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="extract-utilities" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794513 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794518 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794526 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794533 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794539 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="extract-content" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794545 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="extract-content" Dec 02 15:56:27 crc kubenswrapper[4933]: E1202 15:56:27.794552 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="extract-content" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794558 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="extract-content" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794646 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5707d4-08a6-4d6a-9cb6-efbaf25caa15" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794660 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4379206-f985-4847-90b4-604172bb7e6d" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.794667 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="49685142-ff9c-4dec-9f6a-dbc8f06b7174" containerName="registry-server" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.795105 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.801888 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9r4r"] Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.991193 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.991270 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:27 crc kubenswrapper[4933]: I1202 15:56:27.991307 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8pm\" (UniqueName: \"kubernetes.io/projected/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-kube-api-access-xt8pm\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.093329 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k9r4r\" 
(UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.093697 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.093732 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8pm\" (UniqueName: \"kubernetes.io/projected/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-kube-api-access-xt8pm\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.095583 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.110749 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.117807 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8pm\" (UniqueName: \"kubernetes.io/projected/e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51-kube-api-access-xt8pm\") pod \"marketplace-operator-79b997595-k9r4r\" (UID: \"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.118182 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.143730 4933 generic.go:334] "Generic (PLEG): container finished" podID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerID="b7908cb3ce05f99c0ff3b8500eb3564591f14c669382713dd6aca2221b612082" exitCode=0 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.143851 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerDied","Data":"b7908cb3ce05f99c0ff3b8500eb3564591f14c669382713dd6aca2221b612082"} Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.143911 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54gm8" event={"ID":"c7cef2b5-6b23-47ba-83ce-161a02f128a1","Type":"ContainerDied","Data":"b509b031e61c155d89d15104c7bb3788900dc6b9ed585a9fa8c9fad628a27c2c"} Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.143929 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b509b031e61c155d89d15104c7bb3788900dc6b9ed585a9fa8c9fad628a27c2c" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.146337 4933 generic.go:334] "Generic (PLEG): container finished" podID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerID="c583d93a0b8b419befad740071a84b41fe53c2fb71edd40029821c1edf52785c" exitCode=0 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.146392 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" event={"ID":"cb41f368-0638-40fd-be0f-bc71d0182af8","Type":"ContainerDied","Data":"c583d93a0b8b419befad740071a84b41fe53c2fb71edd40029821c1edf52785c"} Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.149365 4933 generic.go:334] "Generic (PLEG): container finished" podID="43ed131f-a943-4e5e-a11f-e07507790a41" containerID="e7b4ddfc385effe8ebae763018a4d0dc4c55cdb2374f6f12aa9709a955f7fe4a" exitCode=0 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.149516 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb7rb" event={"ID":"43ed131f-a943-4e5e-a11f-e07507790a41","Type":"ContainerDied","Data":"e7b4ddfc385effe8ebae763018a4d0dc4c55cdb2374f6f12aa9709a955f7fe4a"} Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.152242 4933 generic.go:334] "Generic (PLEG): container finished" podID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerID="c5a80b8d57297ab1bbbb1fc495c0187e3370d09045324fb3adb2daa6b5507834" exitCode=0 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.152291 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mp97n" event={"ID":"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c","Type":"ContainerDied","Data":"c5a80b8d57297ab1bbbb1fc495c0187e3370d09045324fb3adb2daa6b5507834"} Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.154332 4933 generic.go:334] "Generic (PLEG): container finished" podID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerID="3074e86c3f9d3362141c1623062e45d8c282d41b64e81cda76434a9928ac376c" exitCode=0 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.154352 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv2m4" event={"ID":"b3e7c8a1-0224-434e-bf6f-27342e1f27e6","Type":"ContainerDied","Data":"3074e86c3f9d3362141c1623062e45d8c282d41b64e81cda76434a9928ac376c"} Dec 02 15:56:28 crc 
kubenswrapper[4933]: I1202 15:56:28.278595 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.283512 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv2m4" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.287439 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.298765 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj6gq\" (UniqueName: \"kubernetes.io/projected/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-kube-api-access-fj6gq\") pod \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.299810 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-operator-metrics\") pod \"cb41f368-0638-40fd-be0f-bc71d0182af8\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.299865 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98qxk\" (UniqueName: \"kubernetes.io/projected/cb41f368-0638-40fd-be0f-bc71d0182af8-kube-api-access-98qxk\") pod \"cb41f368-0638-40fd-be0f-bc71d0182af8\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.299902 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-utilities\") pod \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.299962 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-catalog-content\") pod \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\" (UID: \"b3e7c8a1-0224-434e-bf6f-27342e1f27e6\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.300034 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjrk7\" (UniqueName: \"kubernetes.io/projected/c7cef2b5-6b23-47ba-83ce-161a02f128a1-kube-api-access-hjrk7\") pod \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.300079 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-catalog-content\") pod \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.300125 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-trusted-ca\") pod \"cb41f368-0638-40fd-be0f-bc71d0182af8\" (UID: \"cb41f368-0638-40fd-be0f-bc71d0182af8\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 
15:56:28.300151 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-utilities\") pod \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\" (UID: \"c7cef2b5-6b23-47ba-83ce-161a02f128a1\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.302294 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-utilities" (OuterVolumeSpecName: "utilities") pod "c7cef2b5-6b23-47ba-83ce-161a02f128a1" (UID: "c7cef2b5-6b23-47ba-83ce-161a02f128a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.302576 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-kube-api-access-fj6gq" (OuterVolumeSpecName: "kube-api-access-fj6gq") pod "b3e7c8a1-0224-434e-bf6f-27342e1f27e6" (UID: "b3e7c8a1-0224-434e-bf6f-27342e1f27e6"). InnerVolumeSpecName "kube-api-access-fj6gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.304325 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cb41f368-0638-40fd-be0f-bc71d0182af8" (UID: "cb41f368-0638-40fd-be0f-bc71d0182af8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.305222 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-utilities" (OuterVolumeSpecName: "utilities") pod "b3e7c8a1-0224-434e-bf6f-27342e1f27e6" (UID: "b3e7c8a1-0224-434e-bf6f-27342e1f27e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.305378 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.306388 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cef2b5-6b23-47ba-83ce-161a02f128a1-kube-api-access-hjrk7" (OuterVolumeSpecName: "kube-api-access-hjrk7") pod "c7cef2b5-6b23-47ba-83ce-161a02f128a1" (UID: "c7cef2b5-6b23-47ba-83ce-161a02f128a1"). InnerVolumeSpecName "kube-api-access-hjrk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.308345 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cb41f368-0638-40fd-be0f-bc71d0182af8" (UID: "cb41f368-0638-40fd-be0f-bc71d0182af8"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.310202 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb41f368-0638-40fd-be0f-bc71d0182af8-kube-api-access-98qxk" (OuterVolumeSpecName: "kube-api-access-98qxk") pod "cb41f368-0638-40fd-be0f-bc71d0182af8" (UID: "cb41f368-0638-40fd-be0f-bc71d0182af8"). InnerVolumeSpecName "kube-api-access-98qxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.326531 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3e7c8a1-0224-434e-bf6f-27342e1f27e6" (UID: "b3e7c8a1-0224-434e-bf6f-27342e1f27e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.368177 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7cef2b5-6b23-47ba-83ce-161a02f128a1" (UID: "c7cef2b5-6b23-47ba-83ce-161a02f128a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.402569 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-catalog-content\") pod \"43ed131f-a943-4e5e-a11f-e07507790a41\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403062 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxxg4\" (UniqueName: \"kubernetes.io/projected/43ed131f-a943-4e5e-a11f-e07507790a41-kube-api-access-gxxg4\") pod \"43ed131f-a943-4e5e-a11f-e07507790a41\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403120 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-utilities\") pod \"43ed131f-a943-4e5e-a11f-e07507790a41\" (UID: \"43ed131f-a943-4e5e-a11f-e07507790a41\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403392 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403409 4933 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403422 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cef2b5-6b23-47ba-83ce-161a02f128a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403433 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj6gq\" (UniqueName: \"kubernetes.io/projected/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-kube-api-access-fj6gq\") on node \"crc\" 
DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403468 4933 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb41f368-0638-40fd-be0f-bc71d0182af8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403478 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98qxk\" (UniqueName: \"kubernetes.io/projected/cb41f368-0638-40fd-be0f-bc71d0182af8-kube-api-access-98qxk\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403512 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403520 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e7c8a1-0224-434e-bf6f-27342e1f27e6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.403530 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjrk7\" (UniqueName: \"kubernetes.io/projected/c7cef2b5-6b23-47ba-83ce-161a02f128a1-kube-api-access-hjrk7\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.404449 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-utilities" (OuterVolumeSpecName: "utilities") pod "43ed131f-a943-4e5e-a11f-e07507790a41" (UID: "43ed131f-a943-4e5e-a11f-e07507790a41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.406445 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ed131f-a943-4e5e-a11f-e07507790a41-kube-api-access-gxxg4" (OuterVolumeSpecName: "kube-api-access-gxxg4") pod "43ed131f-a943-4e5e-a11f-e07507790a41" (UID: "43ed131f-a943-4e5e-a11f-e07507790a41"). InnerVolumeSpecName "kube-api-access-gxxg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.504805 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxxg4\" (UniqueName: \"kubernetes.io/projected/43ed131f-a943-4e5e-a11f-e07507790a41-kube-api-access-gxxg4\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.504869 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.516901 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43ed131f-a943-4e5e-a11f-e07507790a41" (UID: "43ed131f-a943-4e5e-a11f-e07507790a41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.589066 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9r4r"] Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.596944 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.605882 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfbjr\" (UniqueName: \"kubernetes.io/projected/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-kube-api-access-mfbjr\") pod \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.605959 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-utilities\") pod \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.605983 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-catalog-content\") pod \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\" (UID: \"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c\") " Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.606178 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ed131f-a943-4e5e-a11f-e07507790a41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.607331 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-utilities" (OuterVolumeSpecName: "utilities") pod "4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" (UID: "4f5f6b9a-09c1-46e9-9fba-1de95dd3244c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.612590 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-kube-api-access-mfbjr" (OuterVolumeSpecName: "kube-api-access-mfbjr") pod "4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" (UID: "4f5f6b9a-09c1-46e9-9fba-1de95dd3244c"). InnerVolumeSpecName "kube-api-access-mfbjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.677427 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" (UID: "4f5f6b9a-09c1-46e9-9fba-1de95dd3244c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.708029 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfbjr\" (UniqueName: \"kubernetes.io/projected/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-kube-api-access-mfbjr\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.708196 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.708259 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988330 4933 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988566 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988580 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988595 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988606 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988620 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988628 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988641 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988651 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988662 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988670 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988683 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerName="marketplace-operator" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988704 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerName="marketplace-operator" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988728 4933 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988735 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988745 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988752 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988764 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988771 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988783 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988790 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988798 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.988806 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.988816 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989031 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="extract-utilities" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.989042 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989050 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="extract-content" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989177 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989190 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989204 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989217 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" containerName="registry-server" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 
15:56:28.989227 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" containerName="marketplace-operator" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989613 4933 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.989965 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.990094 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa" gracePeriod=15 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.990229 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588" gracePeriod=15 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.990278 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4" gracePeriod=15 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.990324 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae" gracePeriod=15 Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.990352 4933 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.990370 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f" gracePeriod=15 Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.991151 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.991191 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.991205 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.991214 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.991230 4933 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.991238 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.991267 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992005 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.992019 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992027 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.992039 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992047 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992211 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992245 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992259 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992269 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992278 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992286 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 15:56:28 crc kubenswrapper[4933]: E1202 15:56:28.992432 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.992442 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 15:56:28 crc kubenswrapper[4933]: I1202 15:56:28.999060 4933 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 02 15:56:29 crc kubenswrapper[4933]: 
I1202 15:56:29.012224 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012384 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012455 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012535 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012576 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012603 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012622 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.012738 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: E1202 15:56:29.106300 4933 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" volumeName="registry-storage" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.113986 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114061 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114086 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114114 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114118 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114131 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114173 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114198 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114193 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114237 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114218 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114275 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114341 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114388 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114448 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.114480 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.167314 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv2m4" event={"ID":"b3e7c8a1-0224-434e-bf6f-27342e1f27e6","Type":"ContainerDied","Data":"a9a9d14b75345ba1f3324a744c896f88ebaeea28ad15b10b049f765f75d69597"} Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.167397 4933 scope.go:117] "RemoveContainer" containerID="3074e86c3f9d3362141c1623062e45d8c282d41b64e81cda76434a9928ac376c" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.167600 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv2m4" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.169294 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.171547 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" event={"ID":"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51","Type":"ContainerStarted","Data":"c120112a03a18bfa78b3735edcd753a50cf994b9dfbd07c9b3b4576aa6a513d8"} Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.171611 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" event={"ID":"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51","Type":"ContainerStarted","Data":"161a1cd1b4d2ed3c4f3a5cf5c08bb84eb88cebcecc74312c97d9ed4a6fff056c"} Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.171861 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.172173 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.172546 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.172979 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.173316 4933 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k9r4r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.173354 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.173417 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.173461 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" event={"ID":"cb41f368-0638-40fd-be0f-bc71d0182af8","Type":"ContainerDied","Data":"55c977583d2fbec56c83a31f6b1b81f12f19ab02eca1a47358e92937ad50773c"} Dec 02 15:56:29 crc kubenswrapper[4933]: E1202 15:56:29.173674 4933 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event=< Dec 02 15:56:29 crc kubenswrapper[4933]: &Event{ObjectMeta:{marketplace-operator-79b997595-k9r4r.187d711adc028929 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-k9r4r,UID:e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51,APIVersion:v1,ResourceVersion:29382,FieldPath:spec.containers{marketplace-operator},},Reason:ProbeError,Message:Readiness probe error: Get "http://10.217.0.57:8080/healthz": dial tcp 10.217.0.57:8080: connect: connection refused Dec 02 15:56:29 crc kubenswrapper[4933]: body: Dec 02 15:56:29 crc kubenswrapper[4933]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 15:56:29.173344553 +0000 UTC m=+252.424571256,LastTimestamp:2025-12-02 15:56:29.173344553 +0000 UTC m=+252.424571256,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 02 15:56:29 crc kubenswrapper[4933]: > Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.174090 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.174374 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.174869 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.175269 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.177085 4933 status_manager.go:851] "Failed to get status for pod" 
podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.177409 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.177726 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.182460 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jb7rb" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.182733 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb7rb" event={"ID":"43ed131f-a943-4e5e-a11f-e07507790a41","Type":"ContainerDied","Data":"c90a00abbdeb98e162cb6f3ec6f14ade359977632fff201f542084c60cf22279"} Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.183275 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.183480 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.183738 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.184402 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.188957 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 
15:56:29.191171 4933 scope.go:117] "RemoveContainer" containerID="fe383de997ecef19e7276a057660d2cb3de1bfee4d13ba1b3316b3d03728169d" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.192450 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.192841 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.193034 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.193194 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.193528 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.195326 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f" exitCode=2 Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.200691 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mp97n" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.200694 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mp97n" event={"ID":"4f5f6b9a-09c1-46e9-9fba-1de95dd3244c","Type":"ContainerDied","Data":"3384e192e61babc0ec7997e0b9abf00142a09c86ca8c0cd569a8635d573e9ddd"} Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.200706 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54gm8" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.201645 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.201992 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.202238 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.202517 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.202974 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.203683 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.204790 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.205549 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.205934 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.206296 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.206594 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.206963 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.218128 4933 scope.go:117] "RemoveContainer" containerID="e07988a312b895d475ffb56550309b00cc4b039db26fa8e64b4698fbcfa021ce" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.224567 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.225449 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.226353 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.227189 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.227609 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 
38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.228354 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.228876 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.229530 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.230125 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.230364 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.230550 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.230779 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.240091 4933 scope.go:117] "RemoveContainer" containerID="c583d93a0b8b419befad740071a84b41fe53c2fb71edd40029821c1edf52785c" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.254532 4933 scope.go:117] "RemoveContainer" containerID="e7b4ddfc385effe8ebae763018a4d0dc4c55cdb2374f6f12aa9709a955f7fe4a" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.267235 4933 scope.go:117] "RemoveContainer" containerID="fb3797e3a924c501703f252c1bf7b1f9267e19460cd3f72936f4c849bc6f9155" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.285402 4933 scope.go:117] "RemoveContainer" 
containerID="6e6dfc22a672262f956133e07d0a3a50e62784d26feb2d3518fd9a1c5c94f602" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.298948 4933 scope.go:117] "RemoveContainer" containerID="c5a80b8d57297ab1bbbb1fc495c0187e3370d09045324fb3adb2daa6b5507834" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.315001 4933 scope.go:117] "RemoveContainer" containerID="264a46aff9cc71fa5de98c4b52a9f4c7f908dfec5c819be7a7a9cf31403928b5" Dec 02 15:56:29 crc kubenswrapper[4933]: I1202 15:56:29.327718 4933 scope.go:117] "RemoveContainer" containerID="23e6214856d265959fa8774ac6c12cd5bb4ab1b4ca6d1c36b2875a4eebd80812" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.207751 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.208894 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.209474 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588" exitCode=0 Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.209506 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4" exitCode=0 Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.209515 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae" exitCode=0 Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.209569 4933 scope.go:117] "RemoveContainer" containerID="bdced4864fc5e9a41404f9484c6126634ffcbc3388080207f6a5508be6dc7b19" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.212896 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/0.log" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.212930 4933 generic.go:334] "Generic (PLEG): container finished" podID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" containerID="c120112a03a18bfa78b3735edcd753a50cf994b9dfbd07c9b3b4576aa6a513d8" exitCode=1 Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.212983 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" event={"ID":"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51","Type":"ContainerDied","Data":"c120112a03a18bfa78b3735edcd753a50cf994b9dfbd07c9b3b4576aa6a513d8"} Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.213288 4933 scope.go:117] "RemoveContainer" containerID="c120112a03a18bfa78b3735edcd753a50cf994b9dfbd07c9b3b4576aa6a513d8" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.213645 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.213894 4933 status_manager.go:851] "Failed to get 
status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.214084 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.214327 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.214526 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.214760 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.214858 4933 generic.go:334] "Generic (PLEG): container finished" podID="6a53a119-fa38-4661-950b-e58963acf7ff" containerID="2c641641ce88dcdcd38684d29f2c284954fba776681a96c108d9690bde8d6c63" exitCode=0 Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.214878 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6a53a119-fa38-4661-950b-e58963acf7ff","Type":"ContainerDied","Data":"2c641641ce88dcdcd38684d29f2c284954fba776681a96c108d9690bde8d6c63"} Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.215406 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.215650 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.215900 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.216095 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.216303 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.216621 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:30 crc kubenswrapper[4933]: I1202 15:56:30.216873 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.220756 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/1.log" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.221205 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/0.log" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.221250 4933 generic.go:334] "Generic (PLEG): container finished" podID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" containerID="30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d" exitCode=1 Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.221318 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" event={"ID":"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51","Type":"ContainerDied","Data":"30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d"} Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.221356 4933 scope.go:117] "RemoveContainer" containerID="c120112a03a18bfa78b3735edcd753a50cf994b9dfbd07c9b3b4576aa6a513d8" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.221719 4933 scope.go:117] "RemoveContainer" containerID="30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d" Dec 02 15:56:31 crc kubenswrapper[4933]: E1202 15:56:31.221931 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator 
pod=marketplace-operator-79b997595-k9r4r_openshift-marketplace(e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51)\"" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.222022 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.222365 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.222613 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.222915 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.223273 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.223533 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.223849 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.224691 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.463765 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.464465 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.464630 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.464833 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.465068 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.465219 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.465357 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.465490 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.548768 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-kubelet-dir\") pod \"6a53a119-fa38-4661-950b-e58963acf7ff\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.548914 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-var-lock\") pod \"6a53a119-fa38-4661-950b-e58963acf7ff\" (UID: 
\"6a53a119-fa38-4661-950b-e58963acf7ff\") " Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.548909 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a53a119-fa38-4661-950b-e58963acf7ff" (UID: "6a53a119-fa38-4661-950b-e58963acf7ff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.548950 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a53a119-fa38-4661-950b-e58963acf7ff-kube-api-access\") pod \"6a53a119-fa38-4661-950b-e58963acf7ff\" (UID: \"6a53a119-fa38-4661-950b-e58963acf7ff\") " Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.548988 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-var-lock" (OuterVolumeSpecName: "var-lock") pod "6a53a119-fa38-4661-950b-e58963acf7ff" (UID: "6a53a119-fa38-4661-950b-e58963acf7ff"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.549158 4933 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.549174 4933 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a53a119-fa38-4661-950b-e58963acf7ff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.553991 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a53a119-fa38-4661-950b-e58963acf7ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a53a119-fa38-4661-950b-e58963acf7ff" (UID: "6a53a119-fa38-4661-950b-e58963acf7ff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.650301 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a53a119-fa38-4661-950b-e58963acf7ff-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.856637 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.857406 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.858121 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.858701 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.858993 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.859355 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.859751 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.860115 4933 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.860501 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.860870 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.953844 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954267 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954340 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.953951 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954326 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954521 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954943 4933 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954976 4933 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:31 crc kubenswrapper[4933]: I1202 15:56:31.954995 4933 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.239762 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/1.log" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.244761 4933 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.245291 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.245554 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.245762 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.246207 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.246477 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.246533 4933 scope.go:117] 
"RemoveContainer" containerID="30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.246866 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.247069 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6a53a119-fa38-4661-950b-e58963acf7ff","Type":"ContainerDied","Data":"0a13483c0257773cd7274a79c17b625a3a97b86a6b7b730bf34465abbde91793"} Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.247090 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.247102 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a13483c0257773cd7274a79c17b625a3a97b86a6b7b730bf34465abbde91793" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.247162 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: E1202 15:56:32.247920 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-k9r4r_openshift-marketplace(e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51)\"" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.263221 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.264571 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.264866 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.265219 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 
crc kubenswrapper[4933]: I1202 15:56:32.265474 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.266136 4933 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa" exitCode=0 Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.266193 4933 scope.go:117] "RemoveContainer" containerID="a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.266423 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.266649 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.267094 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.267317 4933 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.267495 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.280958 4933 scope.go:117] "RemoveContainer" containerID="266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.282530 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.282837 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 
38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.283113 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.283294 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.283460 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.283625 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.283837 4933 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.284017 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.293848 4933 scope.go:117] "RemoveContainer" containerID="559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.307101 4933 scope.go:117] "RemoveContainer" containerID="50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.319889 4933 scope.go:117] "RemoveContainer" containerID="67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.334032 4933 scope.go:117] "RemoveContainer" containerID="cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.352321 4933 scope.go:117] "RemoveContainer" containerID="a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588" Dec 02 15:56:32 crc kubenswrapper[4933]: E1202 15:56:32.352786 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\": container with ID starting with a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588 not found: ID does not exist" containerID="a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.352945 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588"} err="failed to get container status \"a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\": rpc error: code = NotFound desc = could not find container \"a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588\": container with ID starting with a0f8661c980893fc646e84a6bb8547946653723fb3f1449d585953e25715d588 not found: ID does not exist" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.352979 4933 scope.go:117] "RemoveContainer" containerID="266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4" Dec 02 15:56:32 crc kubenswrapper[4933]: E1202 15:56:32.353273 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\": container with ID starting with 266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4 not found: ID does not exist" containerID="266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.353311 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4"} err="failed to get container status \"266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\": rpc error: code = NotFound desc = could not find container \"266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4\": container with ID starting with 266ff5f18c317176752753189e1fd32c61d17e22bebba19421d71cdce41c10e4 not found: ID does not exist" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.353337 4933 scope.go:117] "RemoveContainer" containerID="559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae" Dec 02 15:56:32 crc kubenswrapper[4933]: E1202 15:56:32.353598 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\": container with ID starting with 559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae not found: ID does not exist" containerID="559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.353634 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae"} err="failed to get container status \"559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\": rpc error: code = NotFound desc = could not find container \"559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae\": container with ID starting with 559f28d5b8b13a4d78a727f671a75d2c61a391f83089c98749a8071b466c8fae not found: ID does not exist" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.353653 4933 scope.go:117] "RemoveContainer" containerID="50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f" Dec 02 15:56:32 crc 
kubenswrapper[4933]: E1202 15:56:32.354174 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\": container with ID starting with 50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f not found: ID does not exist" containerID="50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.354211 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f"} err="failed to get container status \"50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\": rpc error: code = NotFound desc = could not find container \"50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f\": container with ID starting with 50e3f8f75abacf2b0701b20cc58796727bb04d99836c7ad69c27355d0c7f070f not found: ID does not exist" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.354237 4933 scope.go:117] "RemoveContainer" containerID="67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa" Dec 02 15:56:32 crc kubenswrapper[4933]: E1202 15:56:32.354489 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\": container with ID starting with 67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa not found: ID does not exist" containerID="67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.354523 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa"} err="failed to get container status \"67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\": rpc error: code = NotFound desc = could not find container \"67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa\": container with ID starting with 67497877f8241167fd89cf2ebbc7257a910b4f22e96a1e58f78936317daf19aa not found: ID does not exist" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.354549 4933 scope.go:117] "RemoveContainer" containerID="cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5" Dec 02 15:56:32 crc kubenswrapper[4933]: E1202 15:56:32.354786 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\": container with ID starting with cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5 not found: ID does not exist" containerID="cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5" Dec 02 15:56:32 crc kubenswrapper[4933]: I1202 15:56:32.354817 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5"} err="failed to get container status \"cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\": rpc error: code = NotFound desc = could not find container \"cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5\": container with ID starting with cd63fc5b2ffbd52b09628908868b28e2168bfe8bc453de41ce175aedcd404cc5 not found: ID does not exist" Dec 02 15:56:33 crc kubenswrapper[4933]: 
I1202 15:56:33.060217 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 15:56:34 crc kubenswrapper[4933]: E1202 15:56:34.035924 4933 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:34 crc kubenswrapper[4933]: I1202 15:56:34.036749 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:34 crc kubenswrapper[4933]: W1202 15:56:34.076961 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5a25ffc3e7d1f96fdd62408a2374fc4cb136bd11ec71fc1422ac5041d4489b9a WatchSource:0}: Error finding container 5a25ffc3e7d1f96fdd62408a2374fc4cb136bd11ec71fc1422ac5041d4489b9a: Status 404 returned error can't find the container with id 5a25ffc3e7d1f96fdd62408a2374fc4cb136bd11ec71fc1422ac5041d4489b9a Dec 02 15:56:34 crc kubenswrapper[4933]: I1202 15:56:34.276305 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5a25ffc3e7d1f96fdd62408a2374fc4cb136bd11ec71fc1422ac5041d4489b9a"} Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.282171 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fbeb9626a90286af02747f996c8a9031812820adc659b84cd2af221068ab855e"} Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.282877 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:35 crc kubenswrapper[4933]: E1202 15:56:35.282923 4933 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.283167 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.283395 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.283640 4933 
status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.283977 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.284287 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:35 crc kubenswrapper[4933]: I1202 15:56:35.284512 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:36 crc kubenswrapper[4933]: E1202 15:56:36.288053 4933 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.055849 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.056361 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.056867 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.057115 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.057369 4933 
status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.057639 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.057899 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.667330 4933 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.668132 4933 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.668472 4933 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.668741 4933 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.669011 4933 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:37 crc kubenswrapper[4933]: I1202 15:56:37.669041 4933 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.669249 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.870309 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms" Dec 02 15:56:37 crc kubenswrapper[4933]: E1202 15:56:37.875896 4933 
event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event=< Dec 02 15:56:37 crc kubenswrapper[4933]: &Event{ObjectMeta:{marketplace-operator-79b997595-k9r4r.187d711adc028929 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-k9r4r,UID:e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51,APIVersion:v1,ResourceVersion:29382,FieldPath:spec.containers{marketplace-operator},},Reason:ProbeError,Message:Readiness probe error: Get "http://10.217.0.57:8080/healthz": dial tcp 10.217.0.57:8080: connect: connection refused Dec 02 15:56:37 crc kubenswrapper[4933]: body: Dec 02 15:56:37 crc kubenswrapper[4933]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 15:56:29.173344553 +0000 UTC m=+252.424571256,LastTimestamp:2025-12-02 15:56:29.173344553 +0000 UTC m=+252.424571256,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 02 15:56:37 crc kubenswrapper[4933]: > Dec 02 15:56:38 crc kubenswrapper[4933]: I1202 15:56:38.118983 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:38 crc kubenswrapper[4933]: I1202 15:56:38.119036 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:38 crc kubenswrapper[4933]: I1202 15:56:38.119609 4933 scope.go:117] "RemoveContainer" containerID="30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d" Dec 02 15:56:38 crc kubenswrapper[4933]: E1202 15:56:38.119810 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-k9r4r_openshift-marketplace(e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51)\"" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" Dec 02 15:56:38 crc kubenswrapper[4933]: E1202 15:56:38.272177 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms" Dec 02 15:56:39 crc kubenswrapper[4933]: E1202 15:56:39.073646 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.052479 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.053510 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.053989 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.054508 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.054786 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.055042 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.055353 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.055639 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.065604 4933 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.065636 4933 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:40 crc kubenswrapper[4933]: E1202 15:56:40.066041 4933 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.066678 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.315383 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"27c8bf3313f4cf94794254f269ba8c9d75b182554ac023741d541cb476b10158"} Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.343607 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.344636 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.345409 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.345785 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.346209 4933 status_manager.go:851] "Failed to get status for pod" podUID="df4c64f4-2e92-4262-9f0d-070daf74622e" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-shlpj\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.346598 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.346974 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.347239 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: I1202 15:56:40.347575 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:40 crc kubenswrapper[4933]: E1202 15:56:40.379955 4933 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" volumeName="registry-storage" Dec 02 15:56:40 crc kubenswrapper[4933]: E1202 15:56:40.674278 4933 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.325074 4933 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f02e6a03fba92060cf3910cf67db8e47f4155a20013d61f916812efd53e0f526" exitCode=0 Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.325157 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f02e6a03fba92060cf3910cf67db8e47f4155a20013d61f916812efd53e0f526"} Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.325748 4933 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.325773 4933 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:41 crc kubenswrapper[4933]: E1202 15:56:41.326273 4933 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.326277 4933 status_manager.go:851] "Failed to get status for pod" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" pod="openshift-marketplace/redhat-marketplace-mv2m4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mv2m4\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.326919 4933 status_manager.go:851] "Failed to get status for pod" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" pod="openshift-marketplace/community-operators-54gm8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-54gm8\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 
15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.327248 4933 status_manager.go:851] "Failed to get status for pod" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k9r4r\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.327560 4933 status_manager.go:851] "Failed to get status for pod" podUID="df4c64f4-2e92-4262-9f0d-070daf74622e" pod="openshift-image-registry/image-registry-66df7c8f76-shlpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-shlpj\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.327901 4933 status_manager.go:851] "Failed to get status for pod" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.328219 4933 status_manager.go:851] "Failed to get status for pod" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" pod="openshift-marketplace/marketplace-operator-79b997595-r7dnz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-r7dnz\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.328509 4933 status_manager.go:851] "Failed to get status for pod" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" pod="openshift-marketplace/certified-operators-mp97n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-mp97n\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:41 crc kubenswrapper[4933]: I1202 15:56:41.329079 4933 status_manager.go:851] "Failed to get status for pod" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" pod="openshift-marketplace/redhat-operators-jb7rb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jb7rb\": dial tcp 38.102.83.213:6443: connect: connection refused" Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343092 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"078ecc3481df44dfdf9396da3778f862e824ec43fd3211c3ac8d2a8d9f0ce54d"} Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343413 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343425 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a6956c9e1b63c96e565896043c3795671f624460b72bd4bd3803742a0facd9c0"} Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343436 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1ad79ffd6117d82d98b31952fe9692ba58bc5e2a8bf513b4bc8487fcc35cfde7"} 
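Every status update, mirror-pod call, and lease renewal in the entries above fails the same way: dial tcp 38.102.83.213:6443: connect: connection refused against api-int.crc.testing:6443, while the lease controller's retry interval doubles from 200ms through 400ms, 800ms, and 1.6s up to 3.2s (the marketplace-operator container's CrashLoopBackOff similarly doubles from 10s to 20s further below). The following is a minimal, hypothetical Go probe, not kubelet code, that reproduces that doubling-backoff dial pattern to check when the apiserver endpoint becomes reachable again:

```go
// Connectivity probe for the failure pattern seen above. The doubling retry
// interval mirrors the intervals the kubelet lease controller reports in
// these log entries (200ms -> 400ms -> 800ms -> 1.6s -> 3.2s).
// Hypothetical diagnostic only, not part of the kubelet.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	const endpoint = "api-int.crc.testing:6443" // apiserver endpoint from the log
	interval := 200 * time.Millisecond
	for attempt := 1; attempt <= 5; attempt++ {
		conn, err := net.DialTimeout("tcp", endpoint, 10*time.Second)
		if err == nil {
			conn.Close()
			fmt.Printf("attempt %d: apiserver reachable\n", attempt)
			return
		}
		fmt.Printf("attempt %d: %v; retrying in %v\n", attempt, err, interval)
		time.Sleep(interval)
		interval *= 2 // same doubling the lease controller logs above
	}
}
```

Once a dial succeeds, the "connection refused" flood in the log stops and the kubelet's queued status updates go through, which is what happens below once the restarted kube-apiserver containers pass their startup probe.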
Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343445 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"590b3180503fe52548237d02322f2815d0e8c9117f18928a6167d12eab0945af"} Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343453 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ef29fcdbe0eaae9e9ac1fe59c5f60db04df6880b9f2473f2c44a4be5aef38af4"} Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343326 4933 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.343472 4933 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.345518 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.345568 4933 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d" exitCode=1 Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.345595 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d"} Dec 02 15:56:43 crc kubenswrapper[4933]: I1202 15:56:43.346100 4933 scope.go:117] "RemoveContainer" containerID="fee933bdd8638f0085a6f720a178c8ce59bf46b40a0bcb015ac9c570e25ce97d" Dec 02 15:56:44 crc kubenswrapper[4933]: I1202 15:56:44.352598 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 15:56:44 crc kubenswrapper[4933]: I1202 15:56:44.352930 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"942040b671e9582cbf98abb038647f7a0b537cbe8b9b6d9d2c9e77811c1f7929"} Dec 02 15:56:45 crc kubenswrapper[4933]: I1202 15:56:45.067715 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:45 crc kubenswrapper[4933]: I1202 15:56:45.067754 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:45 crc kubenswrapper[4933]: I1202 15:56:45.072812 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:48 crc kubenswrapper[4933]: I1202 15:56:48.559934 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:56:48 crc kubenswrapper[4933]: I1202 15:56:48.700400 4933 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:48 crc kubenswrapper[4933]: I1202 15:56:48.823560 4933 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="165a904e-ebf5-49ee-97b7-2720f314b31a" Dec 02 15:56:49 crc kubenswrapper[4933]: I1202 15:56:49.398651 4933 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:49 crc kubenswrapper[4933]: I1202 15:56:49.398674 4933 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:49 crc kubenswrapper[4933]: I1202 15:56:49.402347 4933 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="165a904e-ebf5-49ee-97b7-2720f314b31a" Dec 02 15:56:49 crc kubenswrapper[4933]: I1202 15:56:49.402357 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.114428 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" podUID="97c46bec-8e96-4d62-8808-549f712a9802" containerName="oauth-openshift" containerID="cri-o://044d11bf10e397a8a41127d36183d01ab0215c39c452c4c961bcf7be13516bb1" gracePeriod=15 Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.406234 4933 generic.go:334] "Generic (PLEG): container finished" podID="97c46bec-8e96-4d62-8808-549f712a9802" containerID="044d11bf10e397a8a41127d36183d01ab0215c39c452c4c961bcf7be13516bb1" exitCode=0 Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.406341 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" event={"ID":"97c46bec-8e96-4d62-8808-549f712a9802","Type":"ContainerDied","Data":"044d11bf10e397a8a41127d36183d01ab0215c39c452c4c961bcf7be13516bb1"} Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.406899 4933 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.406920 4933 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="67141d41-dade-4d16-8921-1a3eeaef658e" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.410018 4933 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="165a904e-ebf5-49ee-97b7-2720f314b31a" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.623258 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706574 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-provider-selection\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706622 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/97c46bec-8e96-4d62-8808-549f712a9802-audit-dir\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706652 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-idp-0-file-data\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706676 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-serving-cert\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706718 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-ocp-branding-template\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706746 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-trusted-ca-bundle\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706756 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97c46bec-8e96-4d62-8808-549f712a9802-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706767 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt4wv\" (UniqueName: \"kubernetes.io/projected/97c46bec-8e96-4d62-8808-549f712a9802-kube-api-access-mt4wv\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706890 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-error\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706945 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-login\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.706989 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-cliconfig\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.707046 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-session\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.707083 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-audit-policies\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.707146 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-service-ca\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.707181 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-router-certs\") pod \"97c46bec-8e96-4d62-8808-549f712a9802\" (UID: \"97c46bec-8e96-4d62-8808-549f712a9802\") " Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.708113 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.708135 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.708377 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.708450 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.709050 4933 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.709077 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.709096 4933 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/97c46bec-8e96-4d62-8808-549f712a9802-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.709115 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.709134 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.713986 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.714114 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c46bec-8e96-4d62-8808-549f712a9802-kube-api-access-mt4wv" (OuterVolumeSpecName: "kube-api-access-mt4wv") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "kube-api-access-mt4wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.714243 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.716369 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.716725 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.716985 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.720179 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.726086 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.726433 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "97c46bec-8e96-4d62-8808-549f712a9802" (UID: "97c46bec-8e96-4d62-8808-549f712a9802"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810805 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810902 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt4wv\" (UniqueName: \"kubernetes.io/projected/97c46bec-8e96-4d62-8808-549f712a9802-kube-api-access-mt4wv\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810922 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810935 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810949 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810961 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810972 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.810983 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:50 crc kubenswrapper[4933]: I1202 15:56:50.811017 4933 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/97c46bec-8e96-4d62-8808-549f712a9802-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 15:56:51 crc kubenswrapper[4933]: I1202 15:56:51.418582 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" 
event={"ID":"97c46bec-8e96-4d62-8808-549f712a9802","Type":"ContainerDied","Data":"d6a6f452db963ef9d5e86189e27b24357c3586aaaa061626711d36f3333fc55f"} Dec 02 15:56:51 crc kubenswrapper[4933]: I1202 15:56:51.418784 4933 scope.go:117] "RemoveContainer" containerID="044d11bf10e397a8a41127d36183d01ab0215c39c452c4c961bcf7be13516bb1" Dec 02 15:56:51 crc kubenswrapper[4933]: I1202 15:56:51.418634 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-26gvw" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.053502 4933 scope.go:117] "RemoveContainer" containerID="30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.185518 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.189124 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.427654 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/2.log" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.428213 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/1.log" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.428262 4933 generic.go:334] "Generic (PLEG): container finished" podID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" containerID="213f318e4cbcb876a63ec085799ffc63076899bc2c57a7bb95b8e6c3d58753da" exitCode=1 Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.428332 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" event={"ID":"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51","Type":"ContainerDied","Data":"213f318e4cbcb876a63ec085799ffc63076899bc2c57a7bb95b8e6c3d58753da"} Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.428409 4933 scope.go:117] "RemoveContainer" containerID="30fb2c88ce32300cc80a328922ccca20df60fd665631841e12d586d65b24198d" Dec 02 15:56:52 crc kubenswrapper[4933]: I1202 15:56:52.428917 4933 scope.go:117] "RemoveContainer" containerID="213f318e4cbcb876a63ec085799ffc63076899bc2c57a7bb95b8e6c3d58753da" Dec 02 15:56:52 crc kubenswrapper[4933]: E1202 15:56:52.429171 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-k9r4r_openshift-marketplace(e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51)\"" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" Dec 02 15:56:53 crc kubenswrapper[4933]: I1202 15:56:53.438238 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/2.log" Dec 02 15:56:58 crc kubenswrapper[4933]: I1202 15:56:58.118657 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:58 crc 
kubenswrapper[4933]: I1202 15:56:58.119121 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:56:58 crc kubenswrapper[4933]: I1202 15:56:58.119897 4933 scope.go:117] "RemoveContainer" containerID="213f318e4cbcb876a63ec085799ffc63076899bc2c57a7bb95b8e6c3d58753da" Dec 02 15:56:58 crc kubenswrapper[4933]: E1202 15:56:58.120285 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-k9r4r_openshift-marketplace(e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51)\"" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" Dec 02 15:56:58 crc kubenswrapper[4933]: I1202 15:56:58.569960 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 15:56:58 crc kubenswrapper[4933]: I1202 15:56:58.731236 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 15:56:58 crc kubenswrapper[4933]: I1202 15:56:58.812668 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.029561 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.029614 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.104173 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.237220 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.574165 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.596357 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 15:57:00 crc kubenswrapper[4933]: I1202 15:57:00.636937 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 15:57:01 crc kubenswrapper[4933]: I1202 15:57:01.194780 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 15:57:01 crc kubenswrapper[4933]: I1202 15:57:01.376595 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 15:57:01 crc kubenswrapper[4933]: I1202 15:57:01.442645 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 15:57:01 crc kubenswrapper[4933]: I1202 15:57:01.944660 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 15:57:01 crc kubenswrapper[4933]: I1202 15:57:01.998086 4933 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.004061 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.020105 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.127646 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.146621 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.494480 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.550010 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.550689 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.661042 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.835256 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.841494 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.862058 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.885593 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 15:57:02 crc kubenswrapper[4933]: I1202 15:57:02.967682 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.019266 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.081714 4933 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.294437 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.303810 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.357986 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 15:57:03 crc 
kubenswrapper[4933]: I1202 15:57:03.376215 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.532063 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.616276 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.672955 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.736734 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.823380 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.883996 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.952565 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.963589 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 15:57:03 crc kubenswrapper[4933]: I1202 15:57:03.986919 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.026092 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.132600 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.250378 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.562279 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.641663 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.643336 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.911262 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.915191 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.939477 4933 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.946192 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 15:57:04 crc kubenswrapper[4933]: I1202 15:57:04.954920 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.051879 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.074285 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.099895 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.147819 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.154497 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.272098 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.323789 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.342717 4933 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.420094 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.438736 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.457415 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.458437 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.464459 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.484936 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.515792 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.529478 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.627573 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 15:57:05 crc 
kubenswrapper[4933]: I1202 15:57:05.660984 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.693506 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.723596 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.749853 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.881179 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.913659 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.928517 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 15:57:05 crc kubenswrapper[4933]: I1202 15:57:05.941071 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.028586 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.039401 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.160061 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.241509 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.241528 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.279474 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.285448 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.301280 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.371645 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.434299 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.572385 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.585174 4933 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.601163 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.608739 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.642403 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.678437 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.709153 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.738068 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.747365 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 15:57:06 crc kubenswrapper[4933]: I1202 15:57:06.802667 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.073671 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.116362 4933 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.120254 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r7dnz","openshift-marketplace/community-operators-54gm8","openshift-marketplace/certified-operators-mp97n","openshift-marketplace/redhat-marketplace-mv2m4","openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-operators-jb7rb","openshift-authentication/oauth-openshift-558db77b4-26gvw"] Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.120324 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.123787 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.140188 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.140171735 podStartE2EDuration="19.140171735s" podCreationTimestamp="2025-12-02 15:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:07.137371315 +0000 UTC m=+290.388598018" watchObservedRunningTime="2025-12-02 15:57:07.140171735 +0000 UTC m=+290.391398438" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.307543 4933 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.378566 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.438406 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.471537 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.514280 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.615125 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.666696 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.671182 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.799048 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.803940 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 15:57:07 crc kubenswrapper[4933]: I1202 15:57:07.871394 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.070187 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.126648 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.146179 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.153686 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.224942 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.229487 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.261629 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.330228 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.388244 4933 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.488445 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.521777 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.611613 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.746564 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lvtt7"] Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.749716 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.778962 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.913286 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.950460 4933 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 15:57:08 crc kubenswrapper[4933]: I1202 15:57:08.960514 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.036744 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.061085 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ed131f-a943-4e5e-a11f-e07507790a41" path="/var/lib/kubelet/pods/43ed131f-a943-4e5e-a11f-e07507790a41/volumes" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.062152 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5f6b9a-09c1-46e9-9fba-1de95dd3244c" path="/var/lib/kubelet/pods/4f5f6b9a-09c1-46e9-9fba-1de95dd3244c/volumes" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.063027 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c46bec-8e96-4d62-8808-549f712a9802" path="/var/lib/kubelet/pods/97c46bec-8e96-4d62-8808-549f712a9802/volumes" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.064330 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e7c8a1-0224-434e-bf6f-27342e1f27e6" path="/var/lib/kubelet/pods/b3e7c8a1-0224-434e-bf6f-27342e1f27e6/volumes" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.065137 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7cef2b5-6b23-47ba-83ce-161a02f128a1" path="/var/lib/kubelet/pods/c7cef2b5-6b23-47ba-83ce-161a02f128a1/volumes" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.066766 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb41f368-0638-40fd-be0f-bc71d0182af8" path="/var/lib/kubelet/pods/cb41f368-0638-40fd-be0f-bc71d0182af8/volumes" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.073120 4933 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.158176 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.412170 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.558268 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.598117 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.600281 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.653058 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.737696 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.794518 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.819689 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.854890 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.862870 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.899996 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 15:57:09 crc kubenswrapper[4933]: I1202 15:57:09.915234 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.021095 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.053552 4933 scope.go:117] "RemoveContainer" containerID="213f318e4cbcb876a63ec085799ffc63076899bc2c57a7bb95b8e6c3d58753da" Dec 02 15:57:10 crc kubenswrapper[4933]: E1202 15:57:10.053748 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-k9r4r_openshift-marketplace(e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51)\"" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podUID="e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.117007 4933 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 
15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.117191 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fbeb9626a90286af02747f996c8a9031812820adc659b84cd2af221068ab855e" gracePeriod=5 Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.152184 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.254352 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.381739 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.421302 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.429349 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.488093 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.522437 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.554669 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.578904 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.634858 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.647306 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.687437 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.771551 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.792660 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.810134 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.890372 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 15:57:10 crc kubenswrapper[4933]: I1202 15:57:10.898051 4933 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.029017 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.057920 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.119267 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.122485 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.199665 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.371540 4933 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.419912 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.466843 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.479876 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.610091 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.654643 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.666612 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.667386 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.719109 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.736182 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.848489 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 15:57:11 crc kubenswrapper[4933]: I1202 15:57:11.922896 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.119000 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.158311 4933 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.181939 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.238112 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.390601 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.414354 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.500600 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.567637 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.596161 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.615732 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.634428 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.677555 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.706335 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.722137 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.838346 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.903985 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.920326 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.944326 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 15:57:12 crc kubenswrapper[4933]: I1202 15:57:12.965873 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.051360 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: 
I1202 15:57:13.209583 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.213437 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.281664 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.294978 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.376651 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.412195 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.514412 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.520757 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.593594 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.613718 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.677107 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.742083 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.785155 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.833772 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-m29kp"] Dec 02 15:57:13 crc kubenswrapper[4933]: E1202 15:57:13.835295 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" containerName="installer" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.835408 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" containerName="installer" Dec 02 15:57:13 crc kubenswrapper[4933]: E1202 15:57:13.835505 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.835587 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 15:57:13 crc kubenswrapper[4933]: E1202 15:57:13.835683 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c46bec-8e96-4d62-8808-549f712a9802" containerName="oauth-openshift" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.835776 
4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c46bec-8e96-4d62-8808-549f712a9802" containerName="oauth-openshift" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.836024 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c46bec-8e96-4d62-8808-549f712a9802" containerName="oauth-openshift" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.836149 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a53a119-fa38-4661-950b-e58963acf7ff" containerName="installer" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.836275 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.836900 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.839177 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.840996 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.841353 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.841296 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.841503 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.841295 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.841595 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.841988 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.842062 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.842212 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.844748 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.845333 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.848264 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.855761 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-86f4ddc759-m29kp"] Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.857281 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.865971 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.871816 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.904537 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.914025 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.920033 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.933530 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.933724 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.933877 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.934214 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.934332 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " 
pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.934451 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a373e792-ddbe-4358-83fe-4e187199e75f-audit-dir\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.934569 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvnz\" (UniqueName: \"kubernetes.io/projected/a373e792-ddbe-4358-83fe-4e187199e75f-kube-api-access-fdvnz\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.934727 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.934891 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.935013 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.935134 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.935244 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.935366 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-audit-policies\") pod 
\"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:13 crc kubenswrapper[4933]: I1202 15:57:13.935475 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036048 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036326 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-audit-policies\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036409 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036511 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036588 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036671 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036750 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036847 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.036932 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a373e792-ddbe-4358-83fe-4e187199e75f-audit-dir\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.037011 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.037091 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvnz\" (UniqueName: \"kubernetes.io/projected/a373e792-ddbe-4358-83fe-4e187199e75f-kube-api-access-fdvnz\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.037168 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.037253 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.037335 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.038792 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: 
\"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.039486 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a373e792-ddbe-4358-83fe-4e187199e75f-audit-dir\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.039556 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.039559 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-audit-policies\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.040891 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.043904 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.043958 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.045634 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.046596 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 
15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.046977 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.057022 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.058731 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.060523 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a373e792-ddbe-4358-83fe-4e187199e75f-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.061019 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvnz\" (UniqueName: \"kubernetes.io/projected/a373e792-ddbe-4358-83fe-4e187199e75f-kube-api-access-fdvnz\") pod \"oauth-openshift-86f4ddc759-m29kp\" (UID: \"a373e792-ddbe-4358-83fe-4e187199e75f\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.157075 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.161887 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.287762 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.449451 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.614177 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-m29kp"] Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.829563 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 15:57:14 crc kubenswrapper[4933]: I1202 15:57:14.852446 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.006908 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.061520 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.251776 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.382492 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.465390 4933 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.483758 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.576230 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.576558 4933 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fbeb9626a90286af02747f996c8a9031812820adc659b84cd2af221068ab855e" exitCode=137 Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.578216 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" event={"ID":"a373e792-ddbe-4358-83fe-4e187199e75f","Type":"ContainerStarted","Data":"fcaad6e3a615ce580baa3557bd243bd7099b5402bee689656048a0302cabcb04"} Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.578260 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" event={"ID":"a373e792-ddbe-4358-83fe-4e187199e75f","Type":"ContainerStarted","Data":"34c74b23cdeaeaa3dd66b64cd3859960e5b54bb9a4125f361e8cf02377a6b70d"} Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.578540 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 
15:57:15.583709 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.605521 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86f4ddc759-m29kp" podStartSLOduration=50.605500119 podStartE2EDuration="50.605500119s" podCreationTimestamp="2025-12-02 15:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:15.602204344 +0000 UTC m=+298.853431057" watchObservedRunningTime="2025-12-02 15:57:15.605500119 +0000 UTC m=+298.856726832" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.691506 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.712566 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.712638 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.776061 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.856710 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.856770 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.856797 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.856819 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.856912 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.857907 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.857957 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.858001 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.858088 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.859856 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.866202 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.875709 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.958857 4933 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.958892 4933 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.958906 4933 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.958917 4933 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:15 crc kubenswrapper[4933]: I1202 15:57:15.958928 4933 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:16 crc kubenswrapper[4933]: I1202 15:57:16.321334 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 15:57:16 crc kubenswrapper[4933]: I1202 15:57:16.586046 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 15:57:16 crc kubenswrapper[4933]: I1202 15:57:16.586201 4933 scope.go:117] "RemoveContainer" containerID="fbeb9626a90286af02747f996c8a9031812820adc659b84cd2af221068ab855e" Dec 02 15:57:16 crc kubenswrapper[4933]: I1202 15:57:16.586326 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 15:57:16 crc kubenswrapper[4933]: I1202 15:57:16.923211 4933 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 02 15:57:17 crc kubenswrapper[4933]: I1202 15:57:17.019754 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 15:57:17 crc kubenswrapper[4933]: I1202 15:57:17.061313 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 15:57:17 crc kubenswrapper[4933]: I1202 15:57:17.762952 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 15:57:21 crc kubenswrapper[4933]: I1202 15:57:21.053292 4933 scope.go:117] "RemoveContainer" containerID="213f318e4cbcb876a63ec085799ffc63076899bc2c57a7bb95b8e6c3d58753da" Dec 02 15:57:21 crc kubenswrapper[4933]: I1202 15:57:21.610331 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/2.log" Dec 02 15:57:21 crc kubenswrapper[4933]: I1202 15:57:21.610646 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" event={"ID":"e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51","Type":"ContainerStarted","Data":"d2978bb2407b067cdb717e35c8ebf6f315796a6b9de6ebb892f8f747d175c303"} Dec 02 15:57:21 crc kubenswrapper[4933]: I1202 15:57:21.611428 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:57:21 crc kubenswrapper[4933]: I1202 15:57:21.613453 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" Dec 02 15:57:21 crc kubenswrapper[4933]: I1202 15:57:21.625776 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k9r4r" podStartSLOduration=54.625760587 podStartE2EDuration="54.625760587s" podCreationTimestamp="2025-12-02 15:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:21.623529503 +0000 UTC m=+304.874756206" watchObservedRunningTime="2025-12-02 15:57:21.625760587 +0000 UTC m=+304.876987290" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.200504 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5ttq4"] Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.201293 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" podUID="50f1708b-32f7-42c3-a3ec-57f654624efa" containerName="controller-manager" containerID="cri-o://2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9" gracePeriod=30 Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.308851 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"] Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.565563 4933 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.586592 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-proxy-ca-bundles\") pod \"50f1708b-32f7-42c3-a3ec-57f654624efa\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.586673 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9c5\" (UniqueName: \"kubernetes.io/projected/50f1708b-32f7-42c3-a3ec-57f654624efa-kube-api-access-4c9c5\") pod \"50f1708b-32f7-42c3-a3ec-57f654624efa\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.586699 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-config\") pod \"50f1708b-32f7-42c3-a3ec-57f654624efa\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.586745 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-client-ca\") pod \"50f1708b-32f7-42c3-a3ec-57f654624efa\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.586767 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1708b-32f7-42c3-a3ec-57f654624efa-serving-cert\") pod \"50f1708b-32f7-42c3-a3ec-57f654624efa\" (UID: \"50f1708b-32f7-42c3-a3ec-57f654624efa\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.587508 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "50f1708b-32f7-42c3-a3ec-57f654624efa" (UID: "50f1708b-32f7-42c3-a3ec-57f654624efa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.587782 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-client-ca" (OuterVolumeSpecName: "client-ca") pod "50f1708b-32f7-42c3-a3ec-57f654624efa" (UID: "50f1708b-32f7-42c3-a3ec-57f654624efa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.587851 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-config" (OuterVolumeSpecName: "config") pod "50f1708b-32f7-42c3-a3ec-57f654624efa" (UID: "50f1708b-32f7-42c3-a3ec-57f654624efa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.591910 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f1708b-32f7-42c3-a3ec-57f654624efa-kube-api-access-4c9c5" (OuterVolumeSpecName: "kube-api-access-4c9c5") pod "50f1708b-32f7-42c3-a3ec-57f654624efa" (UID: "50f1708b-32f7-42c3-a3ec-57f654624efa"). 
InnerVolumeSpecName "kube-api-access-4c9c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.597077 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f1708b-32f7-42c3-a3ec-57f654624efa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "50f1708b-32f7-42c3-a3ec-57f654624efa" (UID: "50f1708b-32f7-42c3-a3ec-57f654624efa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.665477 4933 generic.go:334] "Generic (PLEG): container finished" podID="50f1708b-32f7-42c3-a3ec-57f654624efa" containerID="2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9" exitCode=0 Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.665685 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" podUID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" containerName="route-controller-manager" containerID="cri-o://638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4" gracePeriod=30 Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.666037 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" event={"ID":"50f1708b-32f7-42c3-a3ec-57f654624efa","Type":"ContainerDied","Data":"2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9"} Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.666284 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" event={"ID":"50f1708b-32f7-42c3-a3ec-57f654624efa","Type":"ContainerDied","Data":"31cfa6907247dc63d7723d030649af968668e8ca15715a542073c7c4713a26f5"} Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.666160 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5ttq4" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.666335 4933 scope.go:117] "RemoveContainer" containerID="2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.687160 4933 scope.go:117] "RemoveContainer" containerID="2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.687729 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.687766 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.687779 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9c5\" (UniqueName: \"kubernetes.io/projected/50f1708b-32f7-42c3-a3ec-57f654624efa-kube-api-access-4c9c5\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.687790 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50f1708b-32f7-42c3-a3ec-57f654624efa-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.687801 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f1708b-32f7-42c3-a3ec-57f654624efa-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:33 crc kubenswrapper[4933]: E1202 15:57:33.688491 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9\": container with ID starting with 2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9 not found: ID does not exist" containerID="2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.688519 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9"} err="failed to get container status \"2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9\": rpc error: code = NotFound desc = could not find container \"2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9\": container with ID starting with 2fb3e2c3fff05da92a6133150ad1ffae122f269e7d7bba156439bfb4c776cec9 not found: ID does not exist" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.693603 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5ttq4"] Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.696399 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5ttq4"] Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.788969 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" podUID="977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" containerName="registry" 
containerID="cri-o://097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5" gracePeriod=30 Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.946934 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.993747 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bhdx\" (UniqueName: \"kubernetes.io/projected/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-kube-api-access-6bhdx\") pod \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.993802 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-client-ca\") pod \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.993921 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-serving-cert\") pod \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.993948 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-config\") pod \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\" (UID: \"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5\") " Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.994584 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" (UID: "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.998006 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-kube-api-access-6bhdx" (OuterVolumeSpecName: "kube-api-access-6bhdx") pod "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" (UID: "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5"). InnerVolumeSpecName "kube-api-access-6bhdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:57:33 crc kubenswrapper[4933]: I1202 15:57:33.999268 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-config" (OuterVolumeSpecName: "config") pod "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" (UID: "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.004970 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" (UID: "8a3e77e2-5cb3-44df-8570-e18f8e8f15a5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.095542 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.095597 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.095607 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.095616 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bhdx\" (UniqueName: \"kubernetes.io/projected/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5-kube-api-access-6bhdx\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.124702 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196214 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196294 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-certificates\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196347 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7qx\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-kube-api-access-jt7qx\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196375 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-tls\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196436 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-bound-sa-token\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196466 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-ca-trust-extracted\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196526 
4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-trusted-ca\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.196574 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-installation-pull-secrets\") pod \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\" (UID: \"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff\") " Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.197620 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.197775 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.199723 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.201191 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.201787 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.202575 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-kube-api-access-jt7qx" (OuterVolumeSpecName: "kube-api-access-jt7qx") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "kube-api-access-jt7qx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.206586 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.212284 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" (UID: "977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297073 4933 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297108 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7qx\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-kube-api-access-jt7qx\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297118 4933 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297129 4933 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297137 4933 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297144 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.297153 4933 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566181 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"] Dec 02 15:57:34 crc kubenswrapper[4933]: E1202 15:57:34.566372 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f1708b-32f7-42c3-a3ec-57f654624efa" containerName="controller-manager" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566383 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f1708b-32f7-42c3-a3ec-57f654624efa" containerName="controller-manager" Dec 02 15:57:34 crc kubenswrapper[4933]: E1202 
15:57:34.566390 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" containerName="route-controller-manager" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566396 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" containerName="route-controller-manager" Dec 02 15:57:34 crc kubenswrapper[4933]: E1202 15:57:34.566408 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" containerName="registry" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566415 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" containerName="registry" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566495 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f1708b-32f7-42c3-a3ec-57f654624efa" containerName="controller-manager" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566503 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" containerName="registry" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566511 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" containerName="route-controller-manager" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.566844 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.570088 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.570187 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.570527 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.571591 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.571686 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"] Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.572279 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.573322 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.573447 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.577079 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.577664 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"] Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.618360 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"] Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.673072 4933 generic.go:334] "Generic (PLEG): container finished" podID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" containerID="638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4" exitCode=0 Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.673113 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" event={"ID":"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5","Type":"ContainerDied","Data":"638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4"} Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.673162 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" event={"ID":"8a3e77e2-5cb3-44df-8570-e18f8e8f15a5","Type":"ContainerDied","Data":"5170cca61e2eda9138aae42d2e43ec7eef1986db9ea6647bfe0e0d93c33f29f2"} Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.673193 4933 scope.go:117] "RemoveContainer" containerID="638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.673334 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.676586 4933 generic.go:334] "Generic (PLEG): container finished" podID="977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" containerID="097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5" exitCode=0 Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.676647 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" event={"ID":"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff","Type":"ContainerDied","Data":"097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5"} Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.676664 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.676670 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lvtt7" event={"ID":"977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff","Type":"ContainerDied","Data":"f262dc09d98ad7fb78b72e0a1d9d0b4a5370d69be2f413077e4e917951945ba8"} Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.692326 4933 scope.go:117] "RemoveContainer" containerID="638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4" Dec 02 15:57:34 crc kubenswrapper[4933]: E1202 15:57:34.692708 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4\": container with ID starting with 638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4 not found: ID does not exist" containerID="638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.692789 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4"} err="failed to get container status \"638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4\": rpc error: code = NotFound desc = could not find container \"638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4\": container with ID starting with 638fa4de69f06fe097dae8e1bceb9cb537d8f26b110bb755c39c62a0120056a4 not found: ID does not exist" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.692852 4933 scope.go:117] "RemoveContainer" containerID="097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.699553 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"] Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.702776 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tdzpv"] Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.719693 4933 scope.go:117] "RemoveContainer" containerID="097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5" Dec 02 15:57:34 crc kubenswrapper[4933]: E1202 15:57:34.721043 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5\": container with ID starting with 097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5 not found: ID does not exist" containerID="097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.721096 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5"} err="failed to get container status \"097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5\": rpc error: code = NotFound desc = could not find container \"097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5\": container with ID starting with 097e9649a54b11a78a403954daefddf325237cb7718f9ab24e9b9ec97d9508a5 not found: ID does not exist" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722566 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-config\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722604 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-client-ca\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722637 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74kjt\" (UniqueName: \"kubernetes.io/projected/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-kube-api-access-74kjt\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722667 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-serving-cert\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722696 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmpk\" (UniqueName: \"kubernetes.io/projected/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-kube-api-access-qjmpk\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722709 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-proxy-ca-bundles\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722726 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-config\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.722747 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-serving-cert\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 
15:57:34.722765 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-client-ca\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.725994 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lvtt7"]
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.728594 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lvtt7"]
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824118 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-client-ca\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824181 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74kjt\" (UniqueName: \"kubernetes.io/projected/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-kube-api-access-74kjt\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824223 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-serving-cert\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824264 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmpk\" (UniqueName: \"kubernetes.io/projected/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-kube-api-access-qjmpk\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824287 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-proxy-ca-bundles\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824313 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-config\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824341 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-serving-cert\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824363 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-client-ca\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.824392 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-config\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.825219 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-client-ca\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.825619 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-config\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.826501 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-client-ca\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.827399 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-config\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.833038 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-serving-cert\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.837573 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-serving-cert\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.842455 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-proxy-ca-bundles\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.845466 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74kjt\" (UniqueName: \"kubernetes.io/projected/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-kube-api-access-74kjt\") pod \"controller-manager-5b8dbb6c7f-fcpzg\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") " pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.849465 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjmpk\" (UniqueName: \"kubernetes.io/projected/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-kube-api-access-qjmpk\") pod \"route-controller-manager-7f4666cb67-hhgf7\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.944330 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:34 crc kubenswrapper[4933]: I1202 15:57:34.954477 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.003186 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"]
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.007259 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"]
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.060248 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f1708b-32f7-42c3-a3ec-57f654624efa" path="/var/lib/kubelet/pods/50f1708b-32f7-42c3-a3ec-57f654624efa/volumes"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.060945 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3e77e2-5cb3-44df-8570-e18f8e8f15a5" path="/var/lib/kubelet/pods/8a3e77e2-5cb3-44df-8570-e18f8e8f15a5/volumes"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.061531 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff" path="/var/lib/kubelet/pods/977aa5a4-bed3-4a5a-b8cf-ed231d1f6cff/volumes"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.220604 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"]
Dec 02 15:57:35 crc kubenswrapper[4933]: W1202 15:57:35.223146 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607cd80f_eb66_4eb2_b11a_715b09f8c8bd.slice/crio-3516be30aa85396f721e33c22f7d9376794afef536dee5fdda408ec5a648ac75 WatchSource:0}: Error finding container 3516be30aa85396f721e33c22f7d9376794afef536dee5fdda408ec5a648ac75: Status 404 returned error can't find the container with id 3516be30aa85396f721e33c22f7d9376794afef536dee5fdda408ec5a648ac75
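The block above shows the volume reconciler's usual two-step pattern: an "operationExecutor.MountVolume started" entry per volume, followed by a "MountVolume.SetUp succeeded" confirmation for the same UniqueName. A minimal sketch of how those entries can be paired when triaging a journal like this one; the program, file name, and regexes are illustrative assumptions for lines in exactly this format, not kubelet code:

```go
// pairmounts.go: pairs "MountVolume started" entries with their
// "MountVolume.SetUp succeeded" confirmations by UniqueName.
// Illustrative triage for journal lines in the format above; not kubelet code.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// klog quotes values as \" inside the outer message, so the regexes
	// match the escaped form exactly as it appears in the journal.
	started   = regexp.MustCompile(`operationExecutor\.MountVolume started for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"`)
	succeeded = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"`)
)

func main() {
	pending := map[string]string{} // UniqueName -> short volume name
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if m := started.FindStringSubmatch(line); m != nil {
			pending[m[2]] = m[1]
		} else if m := succeeded.FindStringSubmatch(line); m != nil {
			delete(pending, m[2])
		}
	}
	for uniq, name := range pending {
		fmt.Printf("no SetUp success seen for %s (%s)\n", name, uniq)
	}
}
```

Any volume still pending at end of input started mounting but never logged a SetUp success within the captured window; in the window above, every started mount is confirmed.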
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.273222 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"]
Dec 02 15:57:35 crc kubenswrapper[4933]: W1202 15:57:35.279332 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3d53b9_25c5_4ce1_8cfa_2341f0e96b7f.slice/crio-74999ff90655af35478a92559d9e9a0e36e8ffcba6eaef2e2ee97ad6c54586e2 WatchSource:0}: Error finding container 74999ff90655af35478a92559d9e9a0e36e8ffcba6eaef2e2ee97ad6c54586e2: Status 404 returned error can't find the container with id 74999ff90655af35478a92559d9e9a0e36e8ffcba6eaef2e2ee97ad6c54586e2
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.685139 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" event={"ID":"607cd80f-eb66-4eb2-b11a-715b09f8c8bd","Type":"ContainerStarted","Data":"1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef"}
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.685473 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" event={"ID":"607cd80f-eb66-4eb2-b11a-715b09f8c8bd","Type":"ContainerStarted","Data":"3516be30aa85396f721e33c22f7d9376794afef536dee5fdda408ec5a648ac75"}
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.685318 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" podUID="607cd80f-eb66-4eb2-b11a-715b09f8c8bd" containerName="controller-manager" containerID="cri-o://1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef" gracePeriod=30
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.685741 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.686688 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" event={"ID":"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f","Type":"ContainerStarted","Data":"a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c"}
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.686711 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" event={"ID":"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f","Type":"ContainerStarted","Data":"74999ff90655af35478a92559d9e9a0e36e8ffcba6eaef2e2ee97ad6c54586e2"}
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.686841 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" podUID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" containerName="route-controller-manager" containerID="cri-o://a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c" gracePeriod=30
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.686984 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.698789 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.706729 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" podStartSLOduration=2.706705076 podStartE2EDuration="2.706705076s" podCreationTimestamp="2025-12-02 15:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:35.703396341 +0000 UTC m=+318.954623054" watchObservedRunningTime="2025-12-02 15:57:35.706705076 +0000 UTC m=+318.957931819"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.722782 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" podStartSLOduration=2.7227643969999997 podStartE2EDuration="2.722764397s" podCreationTimestamp="2025-12-02 15:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:35.721629914 +0000 UTC m=+318.972856617" watchObservedRunningTime="2025-12-02 15:57:35.722764397 +0000 UTC m=+318.973991100"
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.954166 4933 patch_prober.go:28] interesting pod/route-controller-manager-7f4666cb67-hhgf7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:60638->10.217.0.61:8443: read: connection reset by peer" start-of-body=
Dec 02 15:57:35 crc kubenswrapper[4933]: I1202 15:57:35.954212 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" podUID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:60638->10.217.0.61:8443: read: connection reset by peer"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.042958 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.069707 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-md9gd"]
Dec 02 15:57:36 crc kubenswrapper[4933]: E1202 15:57:36.069939 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607cd80f-eb66-4eb2-b11a-715b09f8c8bd" containerName="controller-manager"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.069952 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="607cd80f-eb66-4eb2-b11a-715b09f8c8bd" containerName="controller-manager"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.070035 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="607cd80f-eb66-4eb2-b11a-715b09f8c8bd" containerName="controller-manager"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.070752 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
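The pod_startup_latency_tracker entries above encode simple arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, 2.706705076s for controller-manager-5b8dbb6c7f-fcpzg. A sketch reproducing that subtraction with the timestamps copied from the log; the monotonic "m=+..." suffix is dropped before parsing, and the file name is an assumption:

```go
// slodur.go: reproduces the podStartSLOduration arithmetic in the
// pod_startup_latency_tracker entries above: watchObservedRunningTime
// minus podCreationTimestamp. Illustrative only.
package main

import (
	"fmt"
	"time"
)

// layout matches Go's time.Time string form used in these log attributes.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-02 15:57:33 +0000 UTC")
	observed := mustParse("2025-12-02 15:57:35.706705076 +0000 UTC")
	fmt.Println(observed.Sub(created)) // 2.706705076s, the logged podStartE2EDuration
}
```

The same subtraction for route-controller-manager-7f4666cb67-hhgf7 gives 2.722764397s; the podStartSLOduration=2.7227643969999997 attribute is just that value with float64 rounding.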
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.118790 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-md9gd"]
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.138948 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-proxy-ca-bundles\") pod \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.140819 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "607cd80f-eb66-4eb2-b11a-715b09f8c8bd" (UID: "607cd80f-eb66-4eb2-b11a-715b09f8c8bd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141154 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-client-ca\") pod \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141468 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-config\") pod \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141473 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "607cd80f-eb66-4eb2-b11a-715b09f8c8bd" (UID: "607cd80f-eb66-4eb2-b11a-715b09f8c8bd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141564 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74kjt\" (UniqueName: \"kubernetes.io/projected/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-kube-api-access-74kjt\") pod \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141585 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-serving-cert\") pod \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\" (UID: \"607cd80f-eb66-4eb2-b11a-715b09f8c8bd\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141893 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-config\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141935 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-proxy-ca-bundles\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-client-ca\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.141980 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41d5e210-0a43-4e21-a725-d1a9a0d75128-serving-cert\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.142035 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48g6x\" (UniqueName: \"kubernetes.io/projected/41d5e210-0a43-4e21-a725-d1a9a0d75128-kube-api-access-48g6x\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.142091 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.142102 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-client-ca\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.142943 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-config" (OuterVolumeSpecName: "config") pod "607cd80f-eb66-4eb2-b11a-715b09f8c8bd" (UID: "607cd80f-eb66-4eb2-b11a-715b09f8c8bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.147333 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "607cd80f-eb66-4eb2-b11a-715b09f8c8bd" (UID: "607cd80f-eb66-4eb2-b11a-715b09f8c8bd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.154956 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-kube-api-access-74kjt" (OuterVolumeSpecName: "kube-api-access-74kjt") pod "607cd80f-eb66-4eb2-b11a-715b09f8c8bd" (UID: "607cd80f-eb66-4eb2-b11a-715b09f8c8bd"). InnerVolumeSpecName "kube-api-access-74kjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.184629 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7f4666cb67-hhgf7_cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f/route-controller-manager/0.log"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.184715 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244036 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-config" (OuterVolumeSpecName: "config") pod "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" (UID: "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
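Deleting pod 607cd80f-eb66-4eb2-b11a-715b09f8c8bd drives each of its volumes through the teardown sequence visible above: "UnmountVolume started", then "UnmountVolume.TearDown succeeded" (which reports the Outer/InnerVolumeSpecName pair), and finally "Volume detached". A sketch that tracks the per-volume state from such lines; the regexes, state names, and file name are assumptions for this exact log format, not kubelet code:

```go
// teardown.go: tracks the per-volume unmount lifecycle shown above,
// UnmountVolume started -> TearDown succeeded -> Volume detached,
// keyed by the short (Outer) volume name. Illustrative only.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	unmount  = regexp.MustCompile(`operationExecutor\.UnmountVolume started for volume \\"([^"\\]+)\\"`)
	torndown = regexp.MustCompile(`UnmountVolume\.TearDown succeeded for volume .*\(OuterVolumeSpecName: "([^"]+)"\)`)
	detached = regexp.MustCompile(`Volume detached for volume \\"([^"\\]+)\\"`)
)

func main() {
	state := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		line := sc.Text()
		if m := unmount.FindStringSubmatch(line); m != nil {
			state[m[1]] = "unmount started"
		} else if m := torndown.FindStringSubmatch(line); m != nil {
			state[m[1]] = "torn down"
		} else if m := detached.FindStringSubmatch(line); m != nil {
			state[m[1]] = "detached"
		}
	}
	for vol, s := range state {
		fmt.Printf("%-22s %s\n", vol, s)
	}
}
```

Note that the TearDown lines are the only ones in this window printed as unstructured klog text (plain quotes, no pod= suffix), which is why that regex matches unescaped quotes while the other two match the escaped \" form.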
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244054 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-config\") pod \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244155 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-serving-cert\") pod \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244205 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-client-ca\") pod \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244239 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjmpk\" (UniqueName: \"kubernetes.io/projected/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-kube-api-access-qjmpk\") pod \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\" (UID: \"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f\") "
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244385 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41d5e210-0a43-4e21-a725-d1a9a0d75128-serving-cert\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244443 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48g6x\" (UniqueName: \"kubernetes.io/projected/41d5e210-0a43-4e21-a725-d1a9a0d75128-kube-api-access-48g6x\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244503 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-config\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244534 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-proxy-ca-bundles\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244554 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-client-ca\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244622 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-config\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244633 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-config\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244645 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244665 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" (UID: "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.244733 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74kjt\" (UniqueName: \"kubernetes.io/projected/607cd80f-eb66-4eb2-b11a-715b09f8c8bd-kube-api-access-74kjt\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.245970 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-client-ca\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.246025 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-proxy-ca-bundles\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.247393 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" (UID: "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.248628 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-config\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.249012 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41d5e210-0a43-4e21-a725-d1a9a0d75128-serving-cert\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.249285 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-kube-api-access-qjmpk" (OuterVolumeSpecName: "kube-api-access-qjmpk") pod "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" (UID: "cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f"). InnerVolumeSpecName "kube-api-access-qjmpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.261436 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48g6x\" (UniqueName: \"kubernetes.io/projected/41d5e210-0a43-4e21-a725-d1a9a0d75128-kube-api-access-48g6x\") pod \"controller-manager-6d9f96d886-md9gd\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.345326 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.345358 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-client-ca\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.345369 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjmpk\" (UniqueName: \"kubernetes.io/projected/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f-kube-api-access-qjmpk\") on node \"crc\" DevicePath \"\""
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.384609 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
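In the block above, the old pod's teardown (UID cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f) and the new pod's setup (UID 41d5e210-0a43-4e21-a725-d1a9a0d75128) interleave freely, because each UniqueName embeds its pod UID in the form kubernetes.io/<plugin>/<uid>-<volume>. A sketch that uses that embedding to split a mixed stream per pod; an illustrative heuristic only, not kubelet code:

```go
// byuid.go: groups volume-reconciler entries by the pod UID embedded in each
// UniqueName, separating the old pod's teardown from the new pod's setup.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// A pod UID is a standard 8-4-4-4-12 hex UUID, as in the entries above.
var uniq = regexp.MustCompile(`kubernetes\.io/[a-z-]+/([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})-`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		for _, m := range uniq.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}
	for uid, n := range counts {
		fmt.Printf("%s: %d volume events\n", uid, n)
	}
}
```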
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.572724 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-md9gd"]
Dec 02 15:57:36 crc kubenswrapper[4933]: W1202 15:57:36.579023 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d5e210_0a43_4e21_a725_d1a9a0d75128.slice/crio-5bd947be55896c947b05796e09b72445f201d35b3abe1f3564cab07bb84de9e7 WatchSource:0}: Error finding container 5bd947be55896c947b05796e09b72445f201d35b3abe1f3564cab07bb84de9e7: Status 404 returned error can't find the container with id 5bd947be55896c947b05796e09b72445f201d35b3abe1f3564cab07bb84de9e7
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.693186 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" event={"ID":"41d5e210-0a43-4e21-a725-d1a9a0d75128","Type":"ContainerStarted","Data":"5bd947be55896c947b05796e09b72445f201d35b3abe1f3564cab07bb84de9e7"}
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.695046 4933 generic.go:334] "Generic (PLEG): container finished" podID="607cd80f-eb66-4eb2-b11a-715b09f8c8bd" containerID="1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef" exitCode=0
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.695116 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" event={"ID":"607cd80f-eb66-4eb2-b11a-715b09f8c8bd","Type":"ContainerDied","Data":"1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef"}
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.695136 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg" event={"ID":"607cd80f-eb66-4eb2-b11a-715b09f8c8bd","Type":"ContainerDied","Data":"3516be30aa85396f721e33c22f7d9376794afef536dee5fdda408ec5a648ac75"}
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.695140 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.695154 4933 scope.go:117] "RemoveContainer" containerID="1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.698389 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7f4666cb67-hhgf7_cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f/route-controller-manager/0.log"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.698427 4933 generic.go:334] "Generic (PLEG): container finished" podID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" containerID="a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c" exitCode=255
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.698450 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" event={"ID":"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f","Type":"ContainerDied","Data":"a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c"}
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.698474 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7" event={"ID":"cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f","Type":"ContainerDied","Data":"74999ff90655af35478a92559d9e9a0e36e8ffcba6eaef2e2ee97ad6c54586e2"}
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.698524 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.710706 4933 scope.go:117] "RemoveContainer" containerID="1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef"
Dec 02 15:57:36 crc kubenswrapper[4933]: E1202 15:57:36.711166 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef\": container with ID starting with 1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef not found: ID does not exist" containerID="1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.711206 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef"} err="failed to get container status \"1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef\": rpc error: code = NotFound desc = could not find container \"1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef\": container with ID starting with 1ff28f598a896ebe758934fd8aa859084af049e7af4d8de06115255fcccf9aef not found: ID does not exist"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.711236 4933 scope.go:117] "RemoveContainer" containerID="a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.734397 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"]
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.734924 4933 scope.go:117] "RemoveContainer" containerID="a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c"
Dec 02 15:57:36 crc kubenswrapper[4933]: E1202 15:57:36.735315 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c\": container with ID starting with a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c not found: ID does not exist" containerID="a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.735353 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c"} err="failed to get container status \"a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c\": rpc error: code = NotFound desc = could not find container \"a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c\": container with ID starting with a4013b8370c31abc058a74838c042a9d0f12e48e7c64ffe10c8bc89c28dee83c not found: ID does not exist"
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.741238 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b8dbb6c7f-fcpzg"]
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.748129 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"]
Dec 02 15:57:36 crc kubenswrapper[4933]: I1202 15:57:36.751168 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-hhgf7"]
Dec 02 15:57:37 crc kubenswrapper[4933]: I1202 15:57:37.059123 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607cd80f-eb66-4eb2-b11a-715b09f8c8bd" path="/var/lib/kubelet/pods/607cd80f-eb66-4eb2-b11a-715b09f8c8bd/volumes"
Dec 02 15:57:37 crc kubenswrapper[4933]: I1202 15:57:37.059867 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" path="/var/lib/kubelet/pods/cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f/volumes"
Dec 02 15:57:37 crc kubenswrapper[4933]: I1202 15:57:37.705551 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" event={"ID":"41d5e210-0a43-4e21-a725-d1a9a0d75128","Type":"ContainerStarted","Data":"f3e5f16c76ad452ac89713233792593c5d19c44d2c670d46680cfc9942893516"}
Dec 02 15:57:37 crc kubenswrapper[4933]: I1202 15:57:37.705889 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:37 crc kubenswrapper[4933]: I1202 15:57:37.710745 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd"
Dec 02 15:57:37 crc kubenswrapper[4933]: I1202 15:57:37.722574 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" podStartSLOduration=2.722561144 podStartE2EDuration="2.722561144s" podCreationTimestamp="2025-12-02 15:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:37.722364078 +0000 UTC m=+320.973590781" watchObservedRunningTime="2025-12-02 15:57:37.722561144 +0000 UTC m=+320.973787867"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.568280 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"]
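The NotFound errors above look alarming but are an ordering race: "RemoveContainer" is retried for container IDs that CRI-O has already deleted, so the follow-up ContainerStatus lookup fails with rpc code = NotFound and the kubelet moves on. A sketch that separates that expected case from NotFound errors with no preceding removal when scanning a journal; heuristic triage over the line formats above, not kubelet logic:

```go
// notfound.go: flags the benign remove-after-delete race above, where a
// "RemoveContainer" entry precedes a ContainerStatus rpc error with
// code = NotFound for the same 64-hex container ID.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	remove   = regexp.MustCompile(`"RemoveContainer" containerID="([0-9a-f]{64})"`)
	notfound = regexp.MustCompile(`code = NotFound .* containerID="([0-9a-f]{64})"`)
)

func main() {
	removed := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		line := sc.Text()
		if m := remove.FindStringSubmatch(line); m != nil {
			removed[m[1]] = true
		} else if m := notfound.FindStringSubmatch(line); m != nil {
			if removed[m[1]] {
				fmt.Println("expected NotFound after RemoveContainer:", m[1][:12])
			} else {
				fmt.Println("NotFound without preceding RemoveContainer:", m[1][:12])
			}
		}
	}
}
```

Against the window above this classifies both 1ff28f59... and a4013b83... as the expected case.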
Dec 02 15:57:38 crc kubenswrapper[4933]: E1202 15:57:38.568522 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" containerName="route-controller-manager"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.568537 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" containerName="route-controller-manager"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.568652 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3d53b9-25c5-4ce1-8cfa-2341f0e96b7f" containerName="route-controller-manager"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.569096 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.570807 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.571248 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.571332 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.571709 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.572666 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.572753 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.584250 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"]
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.679217 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-config\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.679271 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-client-ca\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.679331 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c174fb0-8328-4292-8c6a-156d6ae892b8-serving-cert\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.679386 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtbd\" (UniqueName: \"kubernetes.io/projected/3c174fb0-8328-4292-8c6a-156d6ae892b8-kube-api-access-nhtbd\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.780173 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c174fb0-8328-4292-8c6a-156d6ae892b8-serving-cert\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.780264 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtbd\" (UniqueName: \"kubernetes.io/projected/3c174fb0-8328-4292-8c6a-156d6ae892b8-kube-api-access-nhtbd\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.780294 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-config\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.780317 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-client-ca\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.781158 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-client-ca\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.782221 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-config\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.789810 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c174fb0-8328-4292-8c6a-156d6ae892b8-serving-cert\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.800940 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtbd\" (UniqueName: \"kubernetes.io/projected/3c174fb0-8328-4292-8c6a-156d6ae892b8-kube-api-access-nhtbd\") pod \"route-controller-manager-5464fdff99-rll9g\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:38 crc kubenswrapper[4933]: I1202 15:57:38.899133 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:39 crc kubenswrapper[4933]: I1202 15:57:39.354325 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"]
Dec 02 15:57:39 crc kubenswrapper[4933]: W1202 15:57:39.366056 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c174fb0_8328_4292_8c6a_156d6ae892b8.slice/crio-93d1fd8e14262b5a384f61776d1bc0ebc901d774b65c8ec81881036cc541f6c3 WatchSource:0}: Error finding container 93d1fd8e14262b5a384f61776d1bc0ebc901d774b65c8ec81881036cc541f6c3: Status 404 returned error can't find the container with id 93d1fd8e14262b5a384f61776d1bc0ebc901d774b65c8ec81881036cc541f6c3
Dec 02 15:57:39 crc kubenswrapper[4933]: I1202 15:57:39.719220 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" event={"ID":"3c174fb0-8328-4292-8c6a-156d6ae892b8","Type":"ContainerStarted","Data":"b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413"}
Dec 02 15:57:39 crc kubenswrapper[4933]: I1202 15:57:39.719624 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" event={"ID":"3c174fb0-8328-4292-8c6a-156d6ae892b8","Type":"ContainerStarted","Data":"93d1fd8e14262b5a384f61776d1bc0ebc901d774b65c8ec81881036cc541f6c3"}
Dec 02 15:57:39 crc kubenswrapper[4933]: I1202 15:57:39.719731 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:39 crc kubenswrapper[4933]: I1202 15:57:39.736007 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" podStartSLOduration=4.735985491 podStartE2EDuration="4.735985491s" podCreationTimestamp="2025-12-02 15:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:57:39.733425828 +0000 UTC m=+322.984652541" watchObservedRunningTime="2025-12-02 15:57:39.735985491 +0000 UTC m=+322.987212204"
Dec 02 15:57:39 crc kubenswrapper[4933]: I1202 15:57:39.999111 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.459486 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"]
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.460621 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
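For route-controller-manager-5464fdff99-rll9g the klog headers above bound the replacement rollout: "SyncLoop ADD" at 15:57:38.568280 and readiness status="ready" at 15:57:39.999111, about 1.43s apart. A sketch of that subtraction using the klog header layout Immdd hh:mm:ss.uuuuuu; klog omits the year, so the one assumed below comes from the journal's other timestamps, and the helper name is illustrative:

```go
// klogts.go: measures ADD -> readiness-ready latency for one pod from the
// klog headers above. Illustrative only.
package main

import (
	"fmt"
	"time"
)

func klogTime(mmdd, hms string) time.Time {
	// klog headers carry month/day and time only; exactly six fractional digits.
	t, err := time.Parse("0102 15:04:05.000000", mmdd+" "+hms)
	if err != nil {
		panic(err)
	}
	return t.AddDate(2025, 0, 0) // assumed year; the difference does not depend on it
}

func main() {
	added := klogTime("1202", "15:57:38.568280") // "SyncLoop ADD" for ...rll9g
	ready := klogTime("1202", "15:57:39.999111") // "SyncLoop (probe)" status="ready"
	fmt.Println("ADD -> ready:", ready.Sub(added)) // 1.430831s
}
```

The logged podStartSLOduration=4.735985491 is larger because it counts from podCreationTimestamp (15:57:35), not from when this kubelet first saw the pod.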
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.464342 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.465472 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.465483 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.465676 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.470259 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"]
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.470414 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.624484 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da2c9462-d760-47be-a66d-c8493db9f7e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.624541 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmv4\" (UniqueName: \"kubernetes.io/projected/da2c9462-d760-47be-a66d-c8493db9f7e9-kube-api-access-6gmv4\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.624575 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/da2c9462-d760-47be-a66d-c8493db9f7e9-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.726204 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da2c9462-d760-47be-a66d-c8493db9f7e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.726258 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmv4\" (UniqueName: \"kubernetes.io/projected/da2c9462-d760-47be-a66d-c8493db9f7e9-kube-api-access-6gmv4\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.726285 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/da2c9462-d760-47be-a66d-c8493db9f7e9-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.727220 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/da2c9462-d760-47be-a66d-c8493db9f7e9-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.740652 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da2c9462-d760-47be-a66d-c8493db9f7e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.744350 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmv4\" (UniqueName: \"kubernetes.io/projected/da2c9462-d760-47be-a66d-c8493db9f7e9-kube-api-access-6gmv4\") pod \"cluster-monitoring-operator-6d5b84845-b7kqq\" (UID: \"da2c9462-d760-47be-a66d-c8493db9f7e9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:42 crc kubenswrapper[4933]: I1202 15:57:42.777067 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"
Dec 02 15:57:43 crc kubenswrapper[4933]: I1202 15:57:43.172225 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq"]
Dec 02 15:57:43 crc kubenswrapper[4933]: I1202 15:57:43.741042 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq" event={"ID":"da2c9462-d760-47be-a66d-c8493db9f7e9","Type":"ContainerStarted","Data":"db157f701ae1b9eb3ff122a1994a1794f76041a1aee4ad75b19ccc592abd0c95"}
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.831055 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jpwwl"]
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.832836 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.834929 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.842148 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpwwl"]
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.852366 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb82c223-384a-463c-9deb-8cfe4a50ffd7-utilities\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.852428 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rv52\" (UniqueName: \"kubernetes.io/projected/cb82c223-384a-463c-9deb-8cfe4a50ffd7-kube-api-access-4rv52\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.852484 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb82c223-384a-463c-9deb-8cfe4a50ffd7-catalog-content\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.953479 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb82c223-384a-463c-9deb-8cfe4a50ffd7-utilities\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.953553 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rv52\" (UniqueName: \"kubernetes.io/projected/cb82c223-384a-463c-9deb-8cfe4a50ffd7-kube-api-access-4rv52\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.953619 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb82c223-384a-463c-9deb-8cfe4a50ffd7-catalog-content\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.954253 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb82c223-384a-463c-9deb-8cfe4a50ffd7-catalog-content\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.954311 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb82c223-384a-463c-9deb-8cfe4a50ffd7-utilities\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:44 crc kubenswrapper[4933]: I1202 15:57:44.983086 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rv52\" (UniqueName: \"kubernetes.io/projected/cb82c223-384a-463c-9deb-8cfe4a50ffd7-kube-api-access-4rv52\") pod \"redhat-operators-jpwwl\" (UID: \"cb82c223-384a-463c-9deb-8cfe4a50ffd7\") " pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.019853 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6cs"]
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.020877 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.022431 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.028963 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6cs"]
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.054018 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797f2838-8711-4f72-af0c-2fe515a73e03-catalog-content\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.054053 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797f2838-8711-4f72-af0c-2fe515a73e03-utilities\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.054168 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ncng\" (UniqueName: \"kubernetes.io/projected/797f2838-8711-4f72-af0c-2fe515a73e03-kube-api-access-2ncng\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.155422 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797f2838-8711-4f72-af0c-2fe515a73e03-catalog-content\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.155481 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797f2838-8711-4f72-af0c-2fe515a73e03-utilities\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.155572 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ncng\" (UniqueName: \"kubernetes.io/projected/797f2838-8711-4f72-af0c-2fe515a73e03-kube-api-access-2ncng\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.156355 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797f2838-8711-4f72-af0c-2fe515a73e03-catalog-content\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.156856 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797f2838-8711-4f72-af0c-2fe515a73e03-utilities\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.162076 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpwwl"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.173593 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ncng\" (UniqueName: \"kubernetes.io/projected/797f2838-8711-4f72-af0c-2fe515a73e03-kube-api-access-2ncng\") pod \"redhat-marketplace-pr6cs\" (UID: \"797f2838-8711-4f72-af0c-2fe515a73e03\") " pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.342103 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6cs"
Dec 02 15:57:45 crc kubenswrapper[4933]: I1202 15:57:45.550860 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpwwl"]
Dec 02 15:57:45 crc kubenswrapper[4933]: W1202 15:57:45.997964 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb82c223_384a_463c_9deb_8cfe4a50ffd7.slice/crio-9ee2cfcbd5520973969fd2e318e75efb0a269b2ac80673328f9236d21b9e6afe WatchSource:0}: Error finding container 9ee2cfcbd5520973969fd2e318e75efb0a269b2ac80673328f9236d21b9e6afe: Status 404 returned error can't find the container with id 9ee2cfcbd5520973969fd2e318e75efb0a269b2ac80673328f9236d21b9e6afe
Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.458251 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6cs"]
Dec 02 15:57:46 crc kubenswrapper[4933]: W1202 15:57:46.468300 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod797f2838_8711_4f72_af0c_2fe515a73e03.slice/crio-4a864fd8df4b54c8b0fd84c7e7238b356f98029f720e5ee4f0e04ae6f7016444 WatchSource:0}: Error finding container 4a864fd8df4b54c8b0fd84c7e7238b356f98029f720e5ee4f0e04ae6f7016444: Status 404 returned error can't find the container with id 4a864fd8df4b54c8b0fd84c7e7238b356f98029f720e5ee4f0e04ae6f7016444
Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.622615 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m288s"]
Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.623769 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m288s"
Need to start a new one" pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.628156 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.634107 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m288s"] Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.712081 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd"] Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.712711 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.715301 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.715326 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qzr2t" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.757081 4933 generic.go:334] "Generic (PLEG): container finished" podID="797f2838-8711-4f72-af0c-2fe515a73e03" containerID="9b87c8e561404c1f5e7218dd91b49131b63b9409179f0003b30b33d2077d859f" exitCode=0 Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.757149 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6cs" event={"ID":"797f2838-8711-4f72-af0c-2fe515a73e03","Type":"ContainerDied","Data":"9b87c8e561404c1f5e7218dd91b49131b63b9409179f0003b30b33d2077d859f"} Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.757496 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6cs" event={"ID":"797f2838-8711-4f72-af0c-2fe515a73e03","Type":"ContainerStarted","Data":"4a864fd8df4b54c8b0fd84c7e7238b356f98029f720e5ee4f0e04ae6f7016444"} Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.759416 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq" event={"ID":"da2c9462-d760-47be-a66d-c8493db9f7e9","Type":"ContainerStarted","Data":"cab5115156b0fab59f9d6abf04d1078efc9449d866fe0fa11b7b34dc65acba23"} Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.759662 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd"] Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.763651 4933 generic.go:334] "Generic (PLEG): container finished" podID="cb82c223-384a-463c-9deb-8cfe4a50ffd7" containerID="0a0f7ef7b2adbb7c08ea23aefa5bae9afe6082ce819de64914c2ce18ed36c2f1" exitCode=0 Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.763704 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpwwl" event={"ID":"cb82c223-384a-463c-9deb-8cfe4a50ffd7","Type":"ContainerDied","Data":"0a0f7ef7b2adbb7c08ea23aefa5bae9afe6082ce819de64914c2ce18ed36c2f1"} Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.763733 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpwwl" 
event={"ID":"cb82c223-384a-463c-9deb-8cfe4a50ffd7","Type":"ContainerStarted","Data":"9ee2cfcbd5520973969fd2e318e75efb0a269b2ac80673328f9236d21b9e6afe"} Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.773626 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-utilities\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.773708 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-catalog-content\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.773732 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvts\" (UniqueName: \"kubernetes.io/projected/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-kube-api-access-kfvts\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.811479 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-b7kqq" podStartSLOduration=1.930377609 podStartE2EDuration="4.811464258s" podCreationTimestamp="2025-12-02 15:57:42 +0000 UTC" firstStartedPulling="2025-12-02 15:57:43.183036528 +0000 UTC m=+326.434263231" lastFinishedPulling="2025-12-02 15:57:46.064123177 +0000 UTC m=+329.315349880" observedRunningTime="2025-12-02 15:57:46.793152574 +0000 UTC m=+330.044379337" watchObservedRunningTime="2025-12-02 15:57:46.811464258 +0000 UTC m=+330.062690961" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.874546 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b5a63a5-6cb6-4cd7-a927-d0390e66dac8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-42bwd\" (UID: \"6b5a63a5-6cb6-4cd7-a927-d0390e66dac8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.874643 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-catalog-content\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.874664 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvts\" (UniqueName: \"kubernetes.io/projected/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-kube-api-access-kfvts\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.874700 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-utilities\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.875433 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-utilities\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.875489 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-catalog-content\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.894212 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvts\" (UniqueName: \"kubernetes.io/projected/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-kube-api-access-kfvts\") pod \"community-operators-m288s\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.975919 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b5a63a5-6cb6-4cd7-a927-d0390e66dac8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-42bwd\" (UID: \"6b5a63a5-6cb6-4cd7-a927-d0390e66dac8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.979530 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b5a63a5-6cb6-4cd7-a927-d0390e66dac8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-42bwd\" (UID: \"6b5a63a5-6cb6-4cd7-a927-d0390e66dac8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:46 crc kubenswrapper[4933]: I1202 15:57:46.999133 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.026252 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.261695 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m288s"] Dec 02 15:57:47 crc kubenswrapper[4933]: W1202 15:57:47.284117 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb2fcf27_bfa3_429d_8c22_ff2ba63fcdb0.slice/crio-847cf8c6a96570454add401f7d0dd57ca9f8b35c57070876957a99110e00d54b WatchSource:0}: Error finding container 847cf8c6a96570454add401f7d0dd57ca9f8b35c57070876957a99110e00d54b: Status 404 returned error can't find the container with id 847cf8c6a96570454add401f7d0dd57ca9f8b35c57070876957a99110e00d54b Dec 02 15:57:47 crc kubenswrapper[4933]: W1202 15:57:47.509372 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5a63a5_6cb6_4cd7_a927_d0390e66dac8.slice/crio-28d2396d6d9c828ff85801583f850dba37a3f8658c20064a42b5d168a46c49fc WatchSource:0}: Error finding container 28d2396d6d9c828ff85801583f850dba37a3f8658c20064a42b5d168a46c49fc: Status 404 returned error can't find the container with id 28d2396d6d9c828ff85801583f850dba37a3f8658c20064a42b5d168a46c49fc Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.509799 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd"] Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.772707 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6cs" event={"ID":"797f2838-8711-4f72-af0c-2fe515a73e03","Type":"ContainerStarted","Data":"d753ae944eddbc2ef1f3256efe3cb314b0cf4ff0c0003aed0271dc147202187f"} Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.774038 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" event={"ID":"6b5a63a5-6cb6-4cd7-a927-d0390e66dac8","Type":"ContainerStarted","Data":"28d2396d6d9c828ff85801583f850dba37a3f8658c20064a42b5d168a46c49fc"} Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.775602 4933 generic.go:334] "Generic (PLEG): container finished" podID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerID="64c15a8f1950117bf0ef473c7ac3d574ccb1eea8b971594c28e660cada5dbe5a" exitCode=0 Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.775735 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerDied","Data":"64c15a8f1950117bf0ef473c7ac3d574ccb1eea8b971594c28e660cada5dbe5a"} Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.775767 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerStarted","Data":"847cf8c6a96570454add401f7d0dd57ca9f8b35c57070876957a99110e00d54b"} Dec 02 15:57:47 crc kubenswrapper[4933]: I1202 15:57:47.779158 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpwwl" event={"ID":"cb82c223-384a-463c-9deb-8cfe4a50ffd7","Type":"ContainerStarted","Data":"9f201a644e335da0eebfffae605bc8ca9885743ef3b9a2475b5419c98e9f912a"} Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.225047 4933 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-hv4zj"] Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.228424 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.230348 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.235808 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hv4zj"] Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.408590 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt9kp\" (UniqueName: \"kubernetes.io/projected/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-kube-api-access-mt9kp\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.408737 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-catalog-content\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.408857 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-utilities\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.510542 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt9kp\" (UniqueName: \"kubernetes.io/projected/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-kube-api-access-mt9kp\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.510597 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-catalog-content\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.510633 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-utilities\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.511169 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-utilities\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.511520 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-catalog-content\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.536092 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt9kp\" (UniqueName: \"kubernetes.io/projected/fa22a7e7-ae56-415c-ba73-37c19aa18fcb-kube-api-access-mt9kp\") pod \"certified-operators-hv4zj\" (UID: \"fa22a7e7-ae56-415c-ba73-37c19aa18fcb\") " pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.628454 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.787383 4933 generic.go:334] "Generic (PLEG): container finished" podID="cb82c223-384a-463c-9deb-8cfe4a50ffd7" containerID="9f201a644e335da0eebfffae605bc8ca9885743ef3b9a2475b5419c98e9f912a" exitCode=0 Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.787450 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpwwl" event={"ID":"cb82c223-384a-463c-9deb-8cfe4a50ffd7","Type":"ContainerDied","Data":"9f201a644e335da0eebfffae605bc8ca9885743ef3b9a2475b5419c98e9f912a"} Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.795707 4933 generic.go:334] "Generic (PLEG): container finished" podID="797f2838-8711-4f72-af0c-2fe515a73e03" containerID="d753ae944eddbc2ef1f3256efe3cb314b0cf4ff0c0003aed0271dc147202187f" exitCode=0 Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.795759 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6cs" event={"ID":"797f2838-8711-4f72-af0c-2fe515a73e03","Type":"ContainerDied","Data":"d753ae944eddbc2ef1f3256efe3cb314b0cf4ff0c0003aed0271dc147202187f"} Dec 02 15:57:48 crc kubenswrapper[4933]: I1202 15:57:48.798059 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerStarted","Data":"d94de596053ab6113630c603d0368a9e7fee83ffcfc3bf8871cdf07b7fc62b44"} Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.038748 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hv4zj"] Dec 02 15:57:49 crc kubenswrapper[4933]: W1202 15:57:49.278997 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa22a7e7_ae56_415c_ba73_37c19aa18fcb.slice/crio-46c2097144dc8e1e1a8be1047fbde31a41edb5a03f04db7f3d811d09e5928154 WatchSource:0}: Error finding container 46c2097144dc8e1e1a8be1047fbde31a41edb5a03f04db7f3d811d09e5928154: Status 404 returned error can't find the container with id 46c2097144dc8e1e1a8be1047fbde31a41edb5a03f04db7f3d811d09e5928154 Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.804330 4933 generic.go:334] "Generic (PLEG): container finished" podID="fa22a7e7-ae56-415c-ba73-37c19aa18fcb" containerID="bc8b73d25f68b72074487a968ed98ddc8c3d663ee7047c702256407ea39c7f0d" exitCode=0 Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.804395 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4zj" 
event={"ID":"fa22a7e7-ae56-415c-ba73-37c19aa18fcb","Type":"ContainerDied","Data":"bc8b73d25f68b72074487a968ed98ddc8c3d663ee7047c702256407ea39c7f0d"} Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.804735 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4zj" event={"ID":"fa22a7e7-ae56-415c-ba73-37c19aa18fcb","Type":"ContainerStarted","Data":"46c2097144dc8e1e1a8be1047fbde31a41edb5a03f04db7f3d811d09e5928154"} Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.806747 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" event={"ID":"6b5a63a5-6cb6-4cd7-a927-d0390e66dac8","Type":"ContainerStarted","Data":"68cffc9105b9383abe9314da33f585e8e85301c7955d01fa1e239921c0cfa249"} Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.808679 4933 generic.go:334] "Generic (PLEG): container finished" podID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerID="d94de596053ab6113630c603d0368a9e7fee83ffcfc3bf8871cdf07b7fc62b44" exitCode=0 Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.808728 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerDied","Data":"d94de596053ab6113630c603d0368a9e7fee83ffcfc3bf8871cdf07b7fc62b44"} Dec 02 15:57:49 crc kubenswrapper[4933]: I1202 15:57:49.844448 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" podStartSLOduration=2.044727453 podStartE2EDuration="3.844429106s" podCreationTimestamp="2025-12-02 15:57:46 +0000 UTC" firstStartedPulling="2025-12-02 15:57:47.531520138 +0000 UTC m=+330.782746841" lastFinishedPulling="2025-12-02 15:57:49.331221791 +0000 UTC m=+332.582448494" observedRunningTime="2025-12-02 15:57:49.839090794 +0000 UTC m=+333.090317517" watchObservedRunningTime="2025-12-02 15:57:49.844429106 +0000 UTC m=+333.095655829" Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.814906 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6cs" event={"ID":"797f2838-8711-4f72-af0c-2fe515a73e03","Type":"ContainerStarted","Data":"e3d04384c845eb878a92b073d75a802e8dd6ffe8ee21c1ed6382f575850443f5"} Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.816554 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerStarted","Data":"3fe6ada5e8e517625cd45722d3e2289ed1ad3e59a2454e74217b73668d8565c8"} Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.818551 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpwwl" event={"ID":"cb82c223-384a-463c-9deb-8cfe4a50ffd7","Type":"ContainerStarted","Data":"4293397134d7c30745ebac3f0ae1ba9dfb2303cc96080c6cd1cd9b08b6700159"} Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.818712 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.826723 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-42bwd" Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.841077 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pr6cs" podStartSLOduration=2.8223550250000002 podStartE2EDuration="5.841055666s" podCreationTimestamp="2025-12-02 15:57:45 +0000 UTC" firstStartedPulling="2025-12-02 15:57:46.758925385 +0000 UTC m=+330.010152098" lastFinishedPulling="2025-12-02 15:57:49.777626036 +0000 UTC m=+333.028852739" observedRunningTime="2025-12-02 15:57:50.83945797 +0000 UTC m=+334.090684673" watchObservedRunningTime="2025-12-02 15:57:50.841055666 +0000 UTC m=+334.092282369" Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.875579 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m288s" podStartSLOduration=2.417860143 podStartE2EDuration="4.875562382s" podCreationTimestamp="2025-12-02 15:57:46 +0000 UTC" firstStartedPulling="2025-12-02 15:57:47.777503212 +0000 UTC m=+331.028729925" lastFinishedPulling="2025-12-02 15:57:50.235205461 +0000 UTC m=+333.486432164" observedRunningTime="2025-12-02 15:57:50.873703969 +0000 UTC m=+334.124930672" watchObservedRunningTime="2025-12-02 15:57:50.875562382 +0000 UTC m=+334.126789105" Dec 02 15:57:50 crc kubenswrapper[4933]: I1202 15:57:50.888368 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jpwwl" podStartSLOduration=3.909811766 podStartE2EDuration="6.888349358s" podCreationTimestamp="2025-12-02 15:57:44 +0000 UTC" firstStartedPulling="2025-12-02 15:57:46.765512714 +0000 UTC m=+330.016739417" lastFinishedPulling="2025-12-02 15:57:49.744050306 +0000 UTC m=+332.995277009" observedRunningTime="2025-12-02 15:57:50.887564516 +0000 UTC m=+334.138791229" watchObservedRunningTime="2025-12-02 15:57:50.888349358 +0000 UTC m=+334.139576061" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.799138 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-vlt62"] Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.800243 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.802181 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-fnpwf" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.803436 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.803714 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.804083 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.822561 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-vlt62"] Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.950541 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.950596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-metrics-client-ca\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.950617 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:51 crc kubenswrapper[4933]: I1202 15:57:51.950735 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96p4h\" (UniqueName: \"kubernetes.io/projected/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-kube-api-access-96p4h\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.052152 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96p4h\" (UniqueName: \"kubernetes.io/projected/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-kube-api-access-96p4h\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.052217 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.052259 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-metrics-client-ca\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.052289 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.053680 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-metrics-client-ca\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.059405 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.062870 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.073816 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96p4h\" (UniqueName: \"kubernetes.io/projected/cd1667e7-457a-400d-9cf2-1d9c8b0a3a87-kube-api-access-96p4h\") pod \"prometheus-operator-db54df47d-vlt62\" (UID: \"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87\") " pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.113642 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.559352 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-vlt62"] Dec 02 15:57:52 crc kubenswrapper[4933]: W1202 15:57:52.564538 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd1667e7_457a_400d_9cf2_1d9c8b0a3a87.slice/crio-9c602c6722a430706279363326fef9a487f80e180fedb2396677ed2cf2f6c88d WatchSource:0}: Error finding container 9c602c6722a430706279363326fef9a487f80e180fedb2396677ed2cf2f6c88d: Status 404 returned error can't find the container with id 9c602c6722a430706279363326fef9a487f80e180fedb2396677ed2cf2f6c88d Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.840038 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" event={"ID":"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87","Type":"ContainerStarted","Data":"9c602c6722a430706279363326fef9a487f80e180fedb2396677ed2cf2f6c88d"} Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.841663 4933 generic.go:334] "Generic (PLEG): container finished" podID="fa22a7e7-ae56-415c-ba73-37c19aa18fcb" containerID="8a48af9a91d20a63f6a8d7ff6d87cca3969b3a3814c3fe8037503f3ee44f7fc7" exitCode=0 Dec 02 15:57:52 crc kubenswrapper[4933]: I1202 15:57:52.841722 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4zj" event={"ID":"fa22a7e7-ae56-415c-ba73-37c19aa18fcb","Type":"ContainerDied","Data":"8a48af9a91d20a63f6a8d7ff6d87cca3969b3a3814c3fe8037503f3ee44f7fc7"} Dec 02 15:57:53 crc kubenswrapper[4933]: I1202 15:57:53.195180 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-md9gd"] Dec 02 15:57:53 crc kubenswrapper[4933]: I1202 15:57:53.195720 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" podUID="41d5e210-0a43-4e21-a725-d1a9a0d75128" containerName="controller-manager" containerID="cri-o://f3e5f16c76ad452ac89713233792593c5d19c44d2c670d46680cfc9942893516" gracePeriod=30 Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.162978 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jpwwl" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.163574 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jpwwl" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.208526 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jpwwl" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.342710 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pr6cs" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.343397 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pr6cs" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.389806 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pr6cs" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.900701 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-pr6cs" Dec 02 15:57:55 crc kubenswrapper[4933]: I1202 15:57:55.910240 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jpwwl" Dec 02 15:57:56 crc kubenswrapper[4933]: I1202 15:57:56.386243 4933 patch_prober.go:28] interesting pod/controller-manager-6d9f96d886-md9gd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Dec 02 15:57:56 crc kubenswrapper[4933]: I1202 15:57:56.386353 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" podUID="41d5e210-0a43-4e21-a725-d1a9a0d75128" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Dec 02 15:57:56 crc kubenswrapper[4933]: I1202 15:57:56.865142 4933 generic.go:334] "Generic (PLEG): container finished" podID="41d5e210-0a43-4e21-a725-d1a9a0d75128" containerID="f3e5f16c76ad452ac89713233792593c5d19c44d2c670d46680cfc9942893516" exitCode=0 Dec 02 15:57:56 crc kubenswrapper[4933]: I1202 15:57:56.865256 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" event={"ID":"41d5e210-0a43-4e21-a725-d1a9a0d75128","Type":"ContainerDied","Data":"f3e5f16c76ad452ac89713233792593c5d19c44d2c670d46680cfc9942893516"} Dec 02 15:57:56 crc kubenswrapper[4933]: I1202 15:57:56.999810 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:56 crc kubenswrapper[4933]: I1202 15:57:56.999893 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:57 crc kubenswrapper[4933]: I1202 15:57:57.037161 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:57 crc kubenswrapper[4933]: I1202 15:57:57.900919 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m288s" Dec 02 15:57:58 crc kubenswrapper[4933]: I1202 15:57:58.892002 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4zj" event={"ID":"fa22a7e7-ae56-415c-ba73-37c19aa18fcb","Type":"ContainerStarted","Data":"96d69df58069f7fa032e9b40a075ff1eb71e7b97d294bcb20e28250b25b80e27"} Dec 02 15:57:58 crc kubenswrapper[4933]: I1202 15:57:58.916237 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hv4zj" podStartSLOduration=2.393632471 podStartE2EDuration="10.916219839s" podCreationTimestamp="2025-12-02 15:57:48 +0000 UTC" firstStartedPulling="2025-12-02 15:57:49.807699156 +0000 UTC m=+333.058925859" lastFinishedPulling="2025-12-02 15:57:58.330286524 +0000 UTC m=+341.581513227" observedRunningTime="2025-12-02 15:57:58.912717599 +0000 UTC m=+342.163944342" watchObservedRunningTime="2025-12-02 15:57:58.916219839 +0000 UTC m=+342.167446542" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.365747 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.393343 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b7f57fc78-42jl9"] Dec 02 15:57:59 crc kubenswrapper[4933]: E1202 15:57:59.393579 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d5e210-0a43-4e21-a725-d1a9a0d75128" containerName="controller-manager" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.393593 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d5e210-0a43-4e21-a725-d1a9a0d75128" containerName="controller-manager" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.393707 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d5e210-0a43-4e21-a725-d1a9a0d75128" containerName="controller-manager" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.394118 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.400303 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b7f57fc78-42jl9"] Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544484 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-client-ca\") pod \"41d5e210-0a43-4e21-a725-d1a9a0d75128\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544566 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41d5e210-0a43-4e21-a725-d1a9a0d75128-serving-cert\") pod \"41d5e210-0a43-4e21-a725-d1a9a0d75128\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544607 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48g6x\" (UniqueName: \"kubernetes.io/projected/41d5e210-0a43-4e21-a725-d1a9a0d75128-kube-api-access-48g6x\") pod \"41d5e210-0a43-4e21-a725-d1a9a0d75128\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544643 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-proxy-ca-bundles\") pod \"41d5e210-0a43-4e21-a725-d1a9a0d75128\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544681 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-config\") pod \"41d5e210-0a43-4e21-a725-d1a9a0d75128\" (UID: \"41d5e210-0a43-4e21-a725-d1a9a0d75128\") " Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544921 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grhw8\" (UniqueName: \"kubernetes.io/projected/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-kube-api-access-grhw8\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 
15:57:59.544962 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-client-ca\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.544995 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-serving-cert\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.545041 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-config\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.545066 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-proxy-ca-bundles\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.545519 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-client-ca" (OuterVolumeSpecName: "client-ca") pod "41d5e210-0a43-4e21-a725-d1a9a0d75128" (UID: "41d5e210-0a43-4e21-a725-d1a9a0d75128"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.545549 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "41d5e210-0a43-4e21-a725-d1a9a0d75128" (UID: "41d5e210-0a43-4e21-a725-d1a9a0d75128"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.545717 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-config" (OuterVolumeSpecName: "config") pod "41d5e210-0a43-4e21-a725-d1a9a0d75128" (UID: "41d5e210-0a43-4e21-a725-d1a9a0d75128"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.553180 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d5e210-0a43-4e21-a725-d1a9a0d75128-kube-api-access-48g6x" (OuterVolumeSpecName: "kube-api-access-48g6x") pod "41d5e210-0a43-4e21-a725-d1a9a0d75128" (UID: "41d5e210-0a43-4e21-a725-d1a9a0d75128"). InnerVolumeSpecName "kube-api-access-48g6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.553552 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d5e210-0a43-4e21-a725-d1a9a0d75128-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "41d5e210-0a43-4e21-a725-d1a9a0d75128" (UID: "41d5e210-0a43-4e21-a725-d1a9a0d75128"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646483 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-config\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646543 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-proxy-ca-bundles\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646617 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grhw8\" (UniqueName: \"kubernetes.io/projected/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-kube-api-access-grhw8\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646656 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-client-ca\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646699 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-serving-cert\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646741 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646769 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41d5e210-0a43-4e21-a725-d1a9a0d75128-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646782 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48g6x\" (UniqueName: \"kubernetes.io/projected/41d5e210-0a43-4e21-a725-d1a9a0d75128-kube-api-access-48g6x\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646793 4933 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.646803 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d5e210-0a43-4e21-a725-d1a9a0d75128-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.647700 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-proxy-ca-bundles\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.647959 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-config\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.648368 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-client-ca\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.657731 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-serving-cert\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.668135 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grhw8\" (UniqueName: \"kubernetes.io/projected/b69ed63c-6af1-4aa9-8712-75c9aa48d50c-kube-api-access-grhw8\") pod \"controller-manager-6b7f57fc78-42jl9\" (UID: \"b69ed63c-6af1-4aa9-8712-75c9aa48d50c\") " pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.708626 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.899655 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" event={"ID":"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87","Type":"ContainerStarted","Data":"7ce8e0744ee226fcdfe3ccd3392ba2134a9b7211c8847a10b67859e5e52dbf98"} Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.899919 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" event={"ID":"cd1667e7-457a-400d-9cf2-1d9c8b0a3a87","Type":"ContainerStarted","Data":"24bfbeb069b5378ac9241a0dd8aa170dce3121b63e6459a692f6c340cdbeb94c"} Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.909650 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" event={"ID":"41d5e210-0a43-4e21-a725-d1a9a0d75128","Type":"ContainerDied","Data":"5bd947be55896c947b05796e09b72445f201d35b3abe1f3564cab07bb84de9e7"} Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.909694 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-md9gd" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.909713 4933 scope.go:117] "RemoveContainer" containerID="f3e5f16c76ad452ac89713233792593c5d19c44d2c670d46680cfc9942893516" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.923682 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b7f57fc78-42jl9"] Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.932955 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-vlt62" podStartSLOduration=2.033733716 podStartE2EDuration="8.932932272s" podCreationTimestamp="2025-12-02 15:57:51 +0000 UTC" firstStartedPulling="2025-12-02 15:57:52.567121973 +0000 UTC m=+335.818348676" lastFinishedPulling="2025-12-02 15:57:59.466320529 +0000 UTC m=+342.717547232" observedRunningTime="2025-12-02 15:57:59.927707862 +0000 UTC m=+343.178934565" watchObservedRunningTime="2025-12-02 15:57:59.932932272 +0000 UTC m=+343.184158985" Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.945279 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-md9gd"] Dec 02 15:57:59 crc kubenswrapper[4933]: W1202 15:57:59.952077 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb69ed63c_6af1_4aa9_8712_75c9aa48d50c.slice/crio-c41d045ac76819f54c320a1200941c3b60e6f9a35ca05a067712bc5a3853d64e WatchSource:0}: Error finding container c41d045ac76819f54c320a1200941c3b60e6f9a35ca05a067712bc5a3853d64e: Status 404 returned error can't find the container with id c41d045ac76819f54c320a1200941c3b60e6f9a35ca05a067712bc5a3853d64e Dec 02 15:57:59 crc kubenswrapper[4933]: I1202 15:57:59.953857 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-md9gd"] Dec 02 15:58:00 crc kubenswrapper[4933]: I1202 15:58:00.917253 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" 
event={"ID":"b69ed63c-6af1-4aa9-8712-75c9aa48d50c","Type":"ContainerStarted","Data":"719f668efe386086ad896b006415ebcf0c265c5db0bd5ef8a6da29c3be55af09"} Dec 02 15:58:00 crc kubenswrapper[4933]: I1202 15:58:00.917296 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" event={"ID":"b69ed63c-6af1-4aa9-8712-75c9aa48d50c","Type":"ContainerStarted","Data":"c41d045ac76819f54c320a1200941c3b60e6f9a35ca05a067712bc5a3853d64e"} Dec 02 15:58:01 crc kubenswrapper[4933]: I1202 15:58:01.060411 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d5e210-0a43-4e21-a725-d1a9a0d75128" path="/var/lib/kubelet/pods/41d5e210-0a43-4e21-a725-d1a9a0d75128/volumes" Dec 02 15:58:01 crc kubenswrapper[4933]: I1202 15:58:01.923694 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:58:01 crc kubenswrapper[4933]: I1202 15:58:01.928352 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" Dec 02 15:58:01 crc kubenswrapper[4933]: I1202 15:58:01.954109 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b7f57fc78-42jl9" podStartSLOduration=8.954088008 podStartE2EDuration="8.954088008s" podCreationTimestamp="2025-12-02 15:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:58:00.941638577 +0000 UTC m=+344.192865280" watchObservedRunningTime="2025-12-02 15:58:01.954088008 +0000 UTC m=+345.205314721" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.098675 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t"] Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.100000 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.111235 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t"] Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.119401 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.119487 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-vddss" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.119552 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.120412 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.195688 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-58vt9"] Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.197788 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.200113 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.200197 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57482a1a-ab70-41d9-a050-6b6b1bd131f9-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.200222 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw66h\" (UniqueName: \"kubernetes.io/projected/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-api-access-fw66h\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.200251 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/57482a1a-ab70-41d9-a050-6b6b1bd131f9-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.200292 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.200324 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.204230 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.204296 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.205757 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-rh9cr" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.225139 4933 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-58vt9"] Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.234673 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-c5xwx"] Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.235934 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.254170 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.254195 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.254661 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-x2dvm" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301054 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw66h\" (UniqueName: \"kubernetes.io/projected/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-api-access-fw66h\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301106 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/57482a1a-ab70-41d9-a050-6b6b1bd131f9-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301129 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24gt\" (UniqueName: \"kubernetes.io/projected/41cf2470-9560-4fa8-9b7f-63324e2acfd0-kube-api-access-h24gt\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301151 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3725235e-c426-4820-928e-16d3c9682550-node-exporter-tls\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301168 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3725235e-c426-4820-928e-16d3c9682550-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301257 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41cf2470-9560-4fa8-9b7f-63324e2acfd0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: 
\"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301389 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3725235e-c426-4820-928e-16d3c9682550-node-exporter-textfile\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301443 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-node-exporter-wtmp\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301485 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: E1202 15:58:02.301561 4933 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301579 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3725235e-c426-4820-928e-16d3c9682550-metrics-client-ca\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: E1202 15:58:02.301609 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-tls podName:57482a1a-ab70-41d9-a050-6b6b1bd131f9 nodeName:}" failed. No retries permitted until 2025-12-02 15:58:02.801592305 +0000 UTC m=+346.052819008 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-rwz9t" (UID: "57482a1a-ab70-41d9-a050-6b6b1bd131f9") : secret "kube-state-metrics-tls" not found Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301641 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301678 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41cf2470-9560-4fa8-9b7f-63324e2acfd0-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301719 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8lk\" (UniqueName: \"kubernetes.io/projected/3725235e-c426-4820-928e-16d3c9682550-kube-api-access-bd8lk\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301742 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-root\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301803 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/41cf2470-9560-4fa8-9b7f-63324e2acfd0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301861 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/57482a1a-ab70-41d9-a050-6b6b1bd131f9-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301886 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-sys\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301942 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.301976 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57482a1a-ab70-41d9-a050-6b6b1bd131f9-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.302620 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.302645 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57482a1a-ab70-41d9-a050-6b6b1bd131f9-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.307184 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.321879 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw66h\" (UniqueName: \"kubernetes.io/projected/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-api-access-fw66h\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403530 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3725235e-c426-4820-928e-16d3c9682550-node-exporter-tls\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403582 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3725235e-c426-4820-928e-16d3c9682550-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403617 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/41cf2470-9560-4fa8-9b7f-63324e2acfd0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403646 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3725235e-c426-4820-928e-16d3c9682550-node-exporter-textfile\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403686 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-node-exporter-wtmp\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403740 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3725235e-c426-4820-928e-16d3c9682550-metrics-client-ca\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403773 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41cf2470-9560-4fa8-9b7f-63324e2acfd0-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403801 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8lk\" (UniqueName: \"kubernetes.io/projected/3725235e-c426-4820-928e-16d3c9682550-kube-api-access-bd8lk\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403838 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-root\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403873 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/41cf2470-9560-4fa8-9b7f-63324e2acfd0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403912 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-sys\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.403952 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h24gt\" (UniqueName: \"kubernetes.io/projected/41cf2470-9560-4fa8-9b7f-63324e2acfd0-kube-api-access-h24gt\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.404900 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-root\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.405328 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3725235e-c426-4820-928e-16d3c9682550-node-exporter-textfile\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.405473 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-node-exporter-wtmp\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.405547 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3725235e-c426-4820-928e-16d3c9682550-sys\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.405644 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41cf2470-9560-4fa8-9b7f-63324e2acfd0-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.405937 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3725235e-c426-4820-928e-16d3c9682550-metrics-client-ca\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.408068 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/41cf2470-9560-4fa8-9b7f-63324e2acfd0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.408136 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3725235e-c426-4820-928e-16d3c9682550-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.408709 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41cf2470-9560-4fa8-9b7f-63324e2acfd0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.408778 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3725235e-c426-4820-928e-16d3c9682550-node-exporter-tls\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.426901 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8lk\" (UniqueName: \"kubernetes.io/projected/3725235e-c426-4820-928e-16d3c9682550-kube-api-access-bd8lk\") pod \"node-exporter-c5xwx\" (UID: \"3725235e-c426-4820-928e-16d3c9682550\") " pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.440701 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24gt\" (UniqueName: \"kubernetes.io/projected/41cf2470-9560-4fa8-9b7f-63324e2acfd0-kube-api-access-h24gt\") pod \"openshift-state-metrics-566fddb674-58vt9\" (UID: \"41cf2470-9560-4fa8-9b7f-63324e2acfd0\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.524713 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.564346 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-c5xwx" Dec 02 15:58:02 crc kubenswrapper[4933]: W1202 15:58:02.586248 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3725235e_c426_4820_928e_16d3c9682550.slice/crio-6f39f236369bb39274cba31b6b2797e0e6d8c3a9f9ad3ec589bc1c1f32fceddc WatchSource:0}: Error finding container 6f39f236369bb39274cba31b6b2797e0e6d8c3a9f9ad3ec589bc1c1f32fceddc: Status 404 returned error can't find the container with id 6f39f236369bb39274cba31b6b2797e0e6d8c3a9f9ad3ec589bc1c1f32fceddc Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.807290 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.816984 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/57482a1a-ab70-41d9-a050-6b6b1bd131f9-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-rwz9t\" (UID: \"57482a1a-ab70-41d9-a050-6b6b1bd131f9\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.921505 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-58vt9"] Dec 02 15:58:02 crc kubenswrapper[4933]: W1202 15:58:02.925107 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41cf2470_9560_4fa8_9b7f_63324e2acfd0.slice/crio-a16af0a7fc10195cb37bf8573caead12ccc5e209fd53c78b1639f18974d99ac3 WatchSource:0}: Error finding container a16af0a7fc10195cb37bf8573caead12ccc5e209fd53c78b1639f18974d99ac3: Status 404 returned error can't find the container with id a16af0a7fc10195cb37bf8573caead12ccc5e209fd53c78b1639f18974d99ac3 Dec 02 15:58:02 crc kubenswrapper[4933]: I1202 15:58:02.929811 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5xwx" event={"ID":"3725235e-c426-4820-928e-16d3c9682550","Type":"ContainerStarted","Data":"6f39f236369bb39274cba31b6b2797e0e6d8c3a9f9ad3ec589bc1c1f32fceddc"} Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.017378 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.385875 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.390274 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.393711 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.393864 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.393961 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.394020 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.394326 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.394454 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.394595 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-27kqx" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.395084 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.405071 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.411607 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422008 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a72be14-2144-4028-8130-716033a27045-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422088 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8a72be14-2144-4028-8130-716033a27045-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422111 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8a72be14-2144-4028-8130-716033a27045-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422134 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-config-volume\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc 
kubenswrapper[4933]: I1202 15:58:03.422164 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422193 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8a72be14-2144-4028-8130-716033a27045-config-out\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422214 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422236 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a72be14-2144-4028-8130-716033a27045-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422275 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfnm\" (UniqueName: \"kubernetes.io/projected/8a72be14-2144-4028-8130-716033a27045-kube-api-access-9dfnm\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422298 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422321 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-web-config\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.422356 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523320 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/8a72be14-2144-4028-8130-716033a27045-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523364 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8a72be14-2144-4028-8130-716033a27045-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523401 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-config-volume\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523427 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523451 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8a72be14-2144-4028-8130-716033a27045-config-out\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523469 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523490 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a72be14-2144-4028-8130-716033a27045-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523522 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfnm\" (UniqueName: \"kubernetes.io/projected/8a72be14-2144-4028-8130-716033a27045-kube-api-access-9dfnm\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523539 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523557 4933 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-web-config\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523591 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.523615 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a72be14-2144-4028-8130-716033a27045-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.524665 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a72be14-2144-4028-8130-716033a27045-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.525005 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8a72be14-2144-4028-8130-716033a27045-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.526719 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a72be14-2144-4028-8130-716033a27045-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.529138 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8a72be14-2144-4028-8130-716033a27045-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.531803 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t"] Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.532079 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8a72be14-2144-4028-8130-716033a27045-config-out\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.532602 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-web-config\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 
15:58:03.532756 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-config-volume\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.535200 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.542180 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfnm\" (UniqueName: \"kubernetes.io/projected/8a72be14-2144-4028-8130-716033a27045-kube-api-access-9dfnm\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.542205 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.544286 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.545634 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8a72be14-2144-4028-8130-716033a27045-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8a72be14-2144-4028-8130-716033a27045\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.712804 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.940191 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" event={"ID":"41cf2470-9560-4fa8-9b7f-63324e2acfd0","Type":"ContainerStarted","Data":"19837ed3174f131064860053b29e4a45d8ddd2720931ffa46b199d8a4cd7158b"} Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.940491 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" event={"ID":"41cf2470-9560-4fa8-9b7f-63324e2acfd0","Type":"ContainerStarted","Data":"acea33f591b63ac99f44867c15c5067ffbcb2bd080c7d0160be7890071cb415d"} Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.940501 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" event={"ID":"41cf2470-9560-4fa8-9b7f-63324e2acfd0","Type":"ContainerStarted","Data":"a16af0a7fc10195cb37bf8573caead12ccc5e209fd53c78b1639f18974d99ac3"} Dec 02 15:58:03 crc kubenswrapper[4933]: I1202 15:58:03.941271 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" event={"ID":"57482a1a-ab70-41d9-a050-6b6b1bd131f9","Type":"ContainerStarted","Data":"8006d48333efd007715a1f45d17fe3bb067af8a5e06d5197030baede3068081f"} Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.141719 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 02 15:58:04 crc kubenswrapper[4933]: W1202 15:58:04.150745 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a72be14_2144_4028_8130_716033a27045.slice/crio-e7ae21f14590a7504851b4281a50907d39159d4444b746bd6b70c8a0166d258a WatchSource:0}: Error finding container e7ae21f14590a7504851b4281a50907d39159d4444b746bd6b70c8a0166d258a: Status 404 returned error can't find the container with id e7ae21f14590a7504851b4281a50907d39159d4444b746bd6b70c8a0166d258a Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.184787 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7869797c6b-dzrq2"] Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.187043 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190102 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190261 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190363 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190578 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-vd62b" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190684 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190841 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.190951 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-39q7ku2bnqvn" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.201289 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7869797c6b-dzrq2"] Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335210 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-tls\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335273 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335307 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-metrics-client-ca\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335381 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335430 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335471 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335504 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-grpc-tls\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.335536 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vv4l\" (UniqueName: \"kubernetes.io/projected/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-kube-api-access-8vv4l\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.436939 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437059 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437125 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-grpc-tls\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437151 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vv4l\" (UniqueName: \"kubernetes.io/projected/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-kube-api-access-8vv4l\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437328 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-tls\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437416 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-metrics-client-ca\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.437451 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.442525 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-metrics-client-ca\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.445039 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.445731 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.446778 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.448594 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" 
(UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.448678 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-grpc-tls\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.449046 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-secret-thanos-querier-tls\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.463415 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vv4l\" (UniqueName: \"kubernetes.io/projected/3a6cc04c-98a7-45f0-aef1-e8bb3466606a-kube-api-access-8vv4l\") pod \"thanos-querier-7869797c6b-dzrq2\" (UID: \"3a6cc04c-98a7-45f0-aef1-e8bb3466606a\") " pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.502071 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.922379 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7869797c6b-dzrq2"] Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.952040 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"e7ae21f14590a7504851b4281a50907d39159d4444b746bd6b70c8a0166d258a"} Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.954420 4933 generic.go:334] "Generic (PLEG): container finished" podID="3725235e-c426-4820-928e-16d3c9682550" containerID="2bbcabe65efdecc6fb85c56cb6a316c0426b8d6f9287e2f0d2eb0c7ce8ef2796" exitCode=0 Dec 02 15:58:04 crc kubenswrapper[4933]: I1202 15:58:04.954454 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5xwx" event={"ID":"3725235e-c426-4820-928e-16d3c9682550","Type":"ContainerDied","Data":"2bbcabe65efdecc6fb85c56cb6a316c0426b8d6f9287e2f0d2eb0c7ce8ef2796"} Dec 02 15:58:05 crc kubenswrapper[4933]: W1202 15:58:05.052874 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a6cc04c_98a7_45f0_aef1_e8bb3466606a.slice/crio-9f942881039e7f3584816887d8f7ae8f6558056deb9ff6fe0457a79a0f1d88e2 WatchSource:0}: Error finding container 9f942881039e7f3584816887d8f7ae8f6558056deb9ff6fe0457a79a0f1d88e2: Status 404 returned error can't find the container with id 9f942881039e7f3584816887d8f7ae8f6558056deb9ff6fe0457a79a0f1d88e2 Dec 02 15:58:05 crc kubenswrapper[4933]: I1202 15:58:05.962591 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" 
event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"9f942881039e7f3584816887d8f7ae8f6558056deb9ff6fe0457a79a0f1d88e2"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.962302 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64864bcb5f-7cr62"] Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.963536 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.974593 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" event={"ID":"57482a1a-ab70-41d9-a050-6b6b1bd131f9","Type":"ContainerStarted","Data":"3e569d331b293d97d95e61f865b75b14eff48c6fe7c330eb6b43ee2475c2d0b7"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.974795 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" event={"ID":"57482a1a-ab70-41d9-a050-6b6b1bd131f9","Type":"ContainerStarted","Data":"c3eadd4245541334a24ea9c245b258f87f291acad2eb29b489f59f17124af99b"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.974839 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" event={"ID":"57482a1a-ab70-41d9-a050-6b6b1bd131f9","Type":"ContainerStarted","Data":"b3d26aa7c75938a51fba40e892b06af776404d0197974e97a3cc63fe6924db69"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.977727 4933 generic.go:334] "Generic (PLEG): container finished" podID="8a72be14-2144-4028-8130-716033a27045" containerID="25738dc81d8468563fffe3cd8ae20bef5d8b8401b59adc3587b158f702fb406e" exitCode=0 Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.977864 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerDied","Data":"25738dc81d8468563fffe3cd8ae20bef5d8b8401b59adc3587b158f702fb406e"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.985233 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" event={"ID":"41cf2470-9560-4fa8-9b7f-63324e2acfd0","Type":"ContainerStarted","Data":"fee8d7d5fdd9b380072518e3709ae4ebf412e7d0cbef44d013e116cccd8f41d8"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.986172 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64864bcb5f-7cr62"] Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.994016 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5xwx" event={"ID":"3725235e-c426-4820-928e-16d3c9682550","Type":"ContainerStarted","Data":"a5404e510b9609e7361302d5b938b5d3e37b72dccf29934a3740f7210ec254e4"} Dec 02 15:58:06 crc kubenswrapper[4933]: I1202 15:58:06.994068 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5xwx" event={"ID":"3725235e-c426-4820-928e-16d3c9682550","Type":"ContainerStarted","Data":"584bbcdd3e235df881f2bae998026d30ade6393d3d44b52ce9d455c33c904d22"} Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.033528 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-58vt9" podStartSLOduration=2.3787149899999998 podStartE2EDuration="5.033511786s" podCreationTimestamp="2025-12-02 15:58:02 +0000 UTC" 
firstStartedPulling="2025-12-02 15:58:03.303576107 +0000 UTC m=+346.554802810" lastFinishedPulling="2025-12-02 15:58:05.958372903 +0000 UTC m=+349.209599606" observedRunningTime="2025-12-02 15:58:07.031222121 +0000 UTC m=+350.282448844" watchObservedRunningTime="2025-12-02 15:58:07.033511786 +0000 UTC m=+350.284738489" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.082917 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-service-ca\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.083023 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-console-config\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.083156 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-oauth-config\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.083228 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-trusted-ca-bundle\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.083270 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-oauth-serving-cert\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.083399 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dftdh\" (UniqueName: \"kubernetes.io/projected/13082fa8-865a-4639-8384-f140198b768f-kube-api-access-dftdh\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.083479 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-serving-cert\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.142428 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-c5xwx" podStartSLOduration=3.617420813 podStartE2EDuration="5.14240218s" podCreationTimestamp="2025-12-02 15:58:02 +0000 UTC" firstStartedPulling="2025-12-02 
15:58:02.587779519 +0000 UTC m=+345.839006222" lastFinishedPulling="2025-12-02 15:58:04.112760886 +0000 UTC m=+347.363987589" observedRunningTime="2025-12-02 15:58:07.133489205 +0000 UTC m=+350.384715918" watchObservedRunningTime="2025-12-02 15:58:07.14240218 +0000 UTC m=+350.393628903" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.149144 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-rwz9t" podStartSLOduration=2.745876619 podStartE2EDuration="5.149122512s" podCreationTimestamp="2025-12-02 15:58:02 +0000 UTC" firstStartedPulling="2025-12-02 15:58:03.561551304 +0000 UTC m=+346.812778007" lastFinishedPulling="2025-12-02 15:58:05.964797197 +0000 UTC m=+349.216023900" observedRunningTime="2025-12-02 15:58:07.115577893 +0000 UTC m=+350.366804606" watchObservedRunningTime="2025-12-02 15:58:07.149122512 +0000 UTC m=+350.400349215" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184405 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-console-config\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184478 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-oauth-config\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184504 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-trusted-ca-bundle\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184533 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-oauth-serving-cert\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184576 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dftdh\" (UniqueName: \"kubernetes.io/projected/13082fa8-865a-4639-8384-f140198b768f-kube-api-access-dftdh\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184612 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-serving-cert\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.184663 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-service-ca\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.185674 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-trusted-ca-bundle\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.185848 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-console-config\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.185925 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-oauth-serving-cert\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.186400 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-service-ca\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.191097 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-serving-cert\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.204853 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dftdh\" (UniqueName: \"kubernetes.io/projected/13082fa8-865a-4639-8384-f140198b768f-kube-api-access-dftdh\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.213484 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-oauth-config\") pod \"console-64864bcb5f-7cr62\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") " pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.279753 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.473380 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-c758f956-cq472"] Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.474474 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.477066 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.477288 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.477286 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-f7llp" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.477330 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.477180 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.477442 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-976kb9siv2slb" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.486104 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-c758f956-cq472"] Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.539938 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c90920b1-0803-4eef-a4dd-1d531494ea38-metrics-server-audit-profiles\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.539978 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr54z\" (UniqueName: \"kubernetes.io/projected/c90920b1-0803-4eef-a4dd-1d531494ea38-kube-api-access-gr54z\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.540116 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-secret-metrics-client-certs\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.540166 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c90920b1-0803-4eef-a4dd-1d531494ea38-audit-log\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.540215 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-client-ca-bundle\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: 
I1202 15:58:07.540527 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90920b1-0803-4eef-a4dd-1d531494ea38-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.540595 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-secret-metrics-server-tls\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.641434 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90920b1-0803-4eef-a4dd-1d531494ea38-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.641495 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-secret-metrics-server-tls\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.641517 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c90920b1-0803-4eef-a4dd-1d531494ea38-metrics-server-audit-profiles\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.641533 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr54z\" (UniqueName: \"kubernetes.io/projected/c90920b1-0803-4eef-a4dd-1d531494ea38-kube-api-access-gr54z\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.641563 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-secret-metrics-client-certs\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.641586 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c90920b1-0803-4eef-a4dd-1d531494ea38-audit-log\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.642262 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/c90920b1-0803-4eef-a4dd-1d531494ea38-audit-log\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.642550 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90920b1-0803-4eef-a4dd-1d531494ea38-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.642923 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-client-ca-bundle\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.642968 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c90920b1-0803-4eef-a4dd-1d531494ea38-metrics-server-audit-profiles\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.645401 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-secret-metrics-client-certs\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.645724 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-client-ca-bundle\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.659857 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c90920b1-0803-4eef-a4dd-1d531494ea38-secret-metrics-server-tls\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.662835 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr54z\" (UniqueName: \"kubernetes.io/projected/c90920b1-0803-4eef-a4dd-1d531494ea38-kube-api-access-gr54z\") pod \"metrics-server-c758f956-cq472\" (UID: \"c90920b1-0803-4eef-a4dd-1d531494ea38\") " pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.857408 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.950960 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn"] Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.951673 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.954224 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.954350 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn"] Dec 02 15:58:07 crc kubenswrapper[4933]: I1202 15:58:07.960772 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.007490 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"2d6f7c30912f4e567988aacc7f31b818816b4f1a761b952acccf4a5b793b1249"} Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.048372 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c215d303-c7ac-40ef-a969-ac30a28d1a83-monitoring-plugin-cert\") pod \"monitoring-plugin-5f6548ffdf-xz8vn\" (UID: \"c215d303-c7ac-40ef-a969-ac30a28d1a83\") " pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.155457 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c215d303-c7ac-40ef-a969-ac30a28d1a83-monitoring-plugin-cert\") pod \"monitoring-plugin-5f6548ffdf-xz8vn\" (UID: \"c215d303-c7ac-40ef-a969-ac30a28d1a83\") " pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.156412 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64864bcb5f-7cr62"] Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.166579 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c215d303-c7ac-40ef-a969-ac30a28d1a83-monitoring-plugin-cert\") pod \"monitoring-plugin-5f6548ffdf-xz8vn\" (UID: \"c215d303-c7ac-40ef-a969-ac30a28d1a83\") " pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.284853 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.302293 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-c758f956-cq472"] Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.629281 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.629500 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.661544 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn"] Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.706753 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.710730 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.712465 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.717305 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.718982 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.719144 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-t8ktg" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.719294 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.719993 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4j46bi8nc1e24" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.720097 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.720163 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.720241 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.720278 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.720348 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.720419 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.725474 4933 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.730041 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.730578 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879202 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879265 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879297 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b8cb102-56f2-48f2-a9cd-3811df574aaf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879314 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879333 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879349 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879364 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-config\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879393 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-thanos-sidecar-tls\") 
pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879413 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879429 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879447 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-web-config\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.879472 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b8cb102-56f2-48f2-a9cd-3811df574aaf-config-out\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.880077 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.880159 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.880196 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59f7\" (UniqueName: \"kubernetes.io/projected/9b8cb102-56f2-48f2-a9cd-3811df574aaf-kube-api-access-m59f7\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.880225 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.880310 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.880359 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.981533 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-config\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982554 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982593 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982612 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982628 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-web-config\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982646 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b8cb102-56f2-48f2-a9cd-3811df574aaf-config-out\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982663 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc 
kubenswrapper[4933]: I1202 15:58:08.982681 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982704 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59f7\" (UniqueName: \"kubernetes.io/projected/9b8cb102-56f2-48f2-a9cd-3811df574aaf-kube-api-access-m59f7\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982720 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982745 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982764 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982798 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982841 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982865 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b8cb102-56f2-48f2-a9cd-3811df574aaf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982882 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc 
kubenswrapper[4933]: I1202 15:58:08.982902 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.982923 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.984053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.987864 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.988664 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.988797 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.988900 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-config\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.989342 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.992657 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.992786 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.993576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.993733 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b8cb102-56f2-48f2-a9cd-3811df574aaf-config-out\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.994733 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b8cb102-56f2-48f2-a9cd-3811df574aaf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.995162 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.995330 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb102-56f2-48f2-a9cd-3811df574aaf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:08 crc kubenswrapper[4933]: I1202 15:58:08.995779 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-web-config\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.001998 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.003576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.007571 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m59f7\" (UniqueName: \"kubernetes.io/projected/9b8cb102-56f2-48f2-a9cd-3811df574aaf-kube-api-access-m59f7\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.011145 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9b8cb102-56f2-48f2-a9cd-3811df574aaf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9b8cb102-56f2-48f2-a9cd-3811df574aaf\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.016198 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" event={"ID":"c215d303-c7ac-40ef-a969-ac30a28d1a83","Type":"ContainerStarted","Data":"465d24523b39ebe719c14d44bee4c12945856dba67e793c270bea4f24bce53ea"} Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.022459 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64864bcb5f-7cr62" event={"ID":"13082fa8-865a-4639-8384-f140198b768f","Type":"ContainerStarted","Data":"6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7"} Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.022496 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64864bcb5f-7cr62" event={"ID":"13082fa8-865a-4639-8384-f140198b768f","Type":"ContainerStarted","Data":"8bcf3d04f6c38abc3da4b0ba3430e208fa5643c20a42769f0f3a213293451f53"} Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.027424 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"78b05d573512e2bb6ce0ba47d90db56e96c6208e6b035184eeeed8d1108495e0"} Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.027457 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"2ead52323362d4d99f98c1388e3523676f4e8e2a47ab37d793b1301785fccbc8"} Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.029881 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-c758f956-cq472" event={"ID":"c90920b1-0803-4eef-a4dd-1d531494ea38","Type":"ContainerStarted","Data":"8f6c244377f9558ca5ce972dc2f3ba30d3eb52718a09765240d034697d4a5156"} Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.043444 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.101985 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hv4zj" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.123369 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64864bcb5f-7cr62" podStartSLOduration=3.123338686 podStartE2EDuration="3.123338686s" podCreationTimestamp="2025-12-02 15:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:58:09.043018639 +0000 UTC m=+352.294245342" watchObservedRunningTime="2025-12-02 15:58:09.123338686 +0000 UTC m=+352.374565389" Dec 02 15:58:09 crc kubenswrapper[4933]: I1202 15:58:09.511755 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 02 15:58:09 crc kubenswrapper[4933]: W1202 15:58:09.817656 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8cb102_56f2_48f2_a9cd_3811df574aaf.slice/crio-2f56cc0ab88e469c5d4a164f7e5376f1e84d2662df7b3d6ffcd08bc52773004f WatchSource:0}: Error finding container 2f56cc0ab88e469c5d4a164f7e5376f1e84d2662df7b3d6ffcd08bc52773004f: Status 404 returned error can't find the container with id 2f56cc0ab88e469c5d4a164f7e5376f1e84d2662df7b3d6ffcd08bc52773004f Dec 02 15:58:10 crc kubenswrapper[4933]: I1202 15:58:10.038072 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"2f56cc0ab88e469c5d4a164f7e5376f1e84d2662df7b3d6ffcd08bc52773004f"} Dec 02 15:58:11 crc kubenswrapper[4933]: I1202 15:58:11.047251 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"19b90f2e3c2ba538b168f6056f563542f7e61587531e86c1b8e296fd17d62043"} Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.060496 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"cfa034a5df917d5709518a8bab66744c52faa39860c5b757a3bb0afb8cb55df9"} Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.063129 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerDied","Data":"aad31f57402780198ca97877c872811652c66f9ba5d5758dcfc5fb9e75d49198"} Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.063001 4933 generic.go:334] "Generic (PLEG): container finished" podID="9b8cb102-56f2-48f2-a9cd-3811df574aaf" containerID="aad31f57402780198ca97877c872811652c66f9ba5d5758dcfc5fb9e75d49198" exitCode=0 Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.065627 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-c758f956-cq472" event={"ID":"c90920b1-0803-4eef-a4dd-1d531494ea38","Type":"ContainerStarted","Data":"890a69a09726fbba15484afdbe69cb0efd8e6fe16a4aeca25d0b33cb92769636"} Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.070162 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"c79b44a1d2d17cf3e033ecfdd0b25563ecca754ffe5c521e0f14a99d673d462e"} Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.070192 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"fbca301bcf5bb1ced212f94a0fe4f6a574938511de1f9d4f5a1318eb52911b4e"} Dec 02 15:58:12 crc kubenswrapper[4933]: I1202 15:58:12.126531 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-c758f956-cq472" podStartSLOduration=2.664629674 podStartE2EDuration="5.126510522s" podCreationTimestamp="2025-12-02 15:58:07 +0000 UTC" firstStartedPulling="2025-12-02 15:58:08.341298783 +0000 UTC m=+351.592525486" lastFinishedPulling="2025-12-02 15:58:10.803179631 +0000 UTC m=+354.054406334" observedRunningTime="2025-12-02 15:58:12.121071067 +0000 UTC m=+355.372297790" watchObservedRunningTime="2025-12-02 15:58:12.126510522 +0000 UTC m=+355.377737225" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.080257 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" event={"ID":"3a6cc04c-98a7-45f0-aef1-e8bb3466606a","Type":"ContainerStarted","Data":"c4a6d0555811dce622247a83456eb2ffdc4ed0ee5710464ac2c4dc831ee32a5c"} Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.080620 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.083836 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"c060c02896879d36228ed6282991f53aedc3c2d74d910cadbca9baafdd1a1ba1"} Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.084077 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"0c9a8a54092acba737d8fa773a28a2078ff954e9d109d2b225645970361240cb"} Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.084092 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"a28d50aca054eeeb5a3a5193f140291bf30f00907b891be40211fb1b7b849a12"} Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.084104 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a72be14-2144-4028-8130-716033a27045","Type":"ContainerStarted","Data":"0273fdaba091370569bb75cf712f5eb83d4439950f872535be2477966f5ff765"} Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.086298 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" event={"ID":"c215d303-c7ac-40ef-a969-ac30a28d1a83","Type":"ContainerStarted","Data":"fcd6d6aad48946fbdfaed5ab240ec3ce5f0c0ef44a4f13efe60541687992d3f8"} Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.086491 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.094001 4933 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.094733 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.110408 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7869797c6b-dzrq2" podStartSLOduration=3.369807434 podStartE2EDuration="9.110388377s" podCreationTimestamp="2025-12-02 15:58:04 +0000 UTC" firstStartedPulling="2025-12-02 15:58:05.054713772 +0000 UTC m=+348.305940475" lastFinishedPulling="2025-12-02 15:58:10.795294715 +0000 UTC m=+354.046521418" observedRunningTime="2025-12-02 15:58:13.106180496 +0000 UTC m=+356.357407219" watchObservedRunningTime="2025-12-02 15:58:13.110388377 +0000 UTC m=+356.361615090" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.141045 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.500567877 podStartE2EDuration="10.141019442s" podCreationTimestamp="2025-12-02 15:58:03 +0000 UTC" firstStartedPulling="2025-12-02 15:58:04.154111369 +0000 UTC m=+347.405338072" lastFinishedPulling="2025-12-02 15:58:10.794562944 +0000 UTC m=+354.045789637" observedRunningTime="2025-12-02 15:58:13.129586786 +0000 UTC m=+356.380813499" watchObservedRunningTime="2025-12-02 15:58:13.141019442 +0000 UTC m=+356.392246145" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.177983 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5f6548ffdf-xz8vn" podStartSLOduration=2.9092866600000002 podStartE2EDuration="6.177968329s" podCreationTimestamp="2025-12-02 15:58:07 +0000 UTC" firstStartedPulling="2025-12-02 15:58:08.688732198 +0000 UTC m=+351.939958901" lastFinishedPulling="2025-12-02 15:58:11.957413847 +0000 UTC m=+355.208640570" observedRunningTime="2025-12-02 15:58:13.177711562 +0000 UTC m=+356.428938265" watchObservedRunningTime="2025-12-02 15:58:13.177968329 +0000 UTC m=+356.429195032" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.230058 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"] Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.230509 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" podUID="3c174fb0-8328-4292-8c6a-156d6ae892b8" containerName="route-controller-manager" containerID="cri-o://b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413" gracePeriod=30 Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.693845 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.787634 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-client-ca\") pod \"3c174fb0-8328-4292-8c6a-156d6ae892b8\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.787734 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-config\") pod \"3c174fb0-8328-4292-8c6a-156d6ae892b8\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.787752 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c174fb0-8328-4292-8c6a-156d6ae892b8-serving-cert\") pod \"3c174fb0-8328-4292-8c6a-156d6ae892b8\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.787796 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhtbd\" (UniqueName: \"kubernetes.io/projected/3c174fb0-8328-4292-8c6a-156d6ae892b8-kube-api-access-nhtbd\") pod \"3c174fb0-8328-4292-8c6a-156d6ae892b8\" (UID: \"3c174fb0-8328-4292-8c6a-156d6ae892b8\") " Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.788843 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c174fb0-8328-4292-8c6a-156d6ae892b8" (UID: "3c174fb0-8328-4292-8c6a-156d6ae892b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.789423 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-config" (OuterVolumeSpecName: "config") pod "3c174fb0-8328-4292-8c6a-156d6ae892b8" (UID: "3c174fb0-8328-4292-8c6a-156d6ae892b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.804212 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c174fb0-8328-4292-8c6a-156d6ae892b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c174fb0-8328-4292-8c6a-156d6ae892b8" (UID: "3c174fb0-8328-4292-8c6a-156d6ae892b8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.804245 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c174fb0-8328-4292-8c6a-156d6ae892b8-kube-api-access-nhtbd" (OuterVolumeSpecName: "kube-api-access-nhtbd") pod "3c174fb0-8328-4292-8c6a-156d6ae892b8" (UID: "3c174fb0-8328-4292-8c6a-156d6ae892b8"). InnerVolumeSpecName "kube-api-access-nhtbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.889212 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.889245 4933 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c174fb0-8328-4292-8c6a-156d6ae892b8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.889256 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhtbd\" (UniqueName: \"kubernetes.io/projected/3c174fb0-8328-4292-8c6a-156d6ae892b8-kube-api-access-nhtbd\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:13 crc kubenswrapper[4933]: I1202 15:58:13.889267 4933 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c174fb0-8328-4292-8c6a-156d6ae892b8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.096881 4933 generic.go:334] "Generic (PLEG): container finished" podID="3c174fb0-8328-4292-8c6a-156d6ae892b8" containerID="b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413" exitCode=0 Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.098392 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.103550 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" event={"ID":"3c174fb0-8328-4292-8c6a-156d6ae892b8","Type":"ContainerDied","Data":"b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413"} Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.103632 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g" event={"ID":"3c174fb0-8328-4292-8c6a-156d6ae892b8","Type":"ContainerDied","Data":"93d1fd8e14262b5a384f61776d1bc0ebc901d774b65c8ec81881036cc541f6c3"} Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.103663 4933 scope.go:117] "RemoveContainer" containerID="b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.131217 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"] Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.133979 4933 scope.go:117] "RemoveContainer" containerID="b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413" Dec 02 15:58:14 crc kubenswrapper[4933]: E1202 15:58:14.134950 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413\": container with ID starting with b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413 not found: ID does not exist" containerID="b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.134987 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413"} 
err="failed to get container status \"b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413\": rpc error: code = NotFound desc = could not find container \"b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413\": container with ID starting with b6c59eaebc636311a7499cc60e7a12ef4228d603fabdcce04d58c430ec94b413 not found: ID does not exist" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.135048 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-rll9g"] Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.596034 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb"] Dec 02 15:58:14 crc kubenswrapper[4933]: E1202 15:58:14.596298 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c174fb0-8328-4292-8c6a-156d6ae892b8" containerName="route-controller-manager" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.596311 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c174fb0-8328-4292-8c6a-156d6ae892b8" containerName="route-controller-manager" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.596423 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c174fb0-8328-4292-8c6a-156d6ae892b8" containerName="route-controller-manager" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.596815 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.611498 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb"] Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.612404 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.612618 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.612780 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.613387 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.613537 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.613691 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.705705 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a837f9ac-906d-4b20-bcd8-5872d41923da-config\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.705759 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a837f9ac-906d-4b20-bcd8-5872d41923da-client-ca\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.705802 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a837f9ac-906d-4b20-bcd8-5872d41923da-serving-cert\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.705894 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wvv4\" (UniqueName: \"kubernetes.io/projected/a837f9ac-906d-4b20-bcd8-5872d41923da-kube-api-access-8wvv4\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.807030 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wvv4\" (UniqueName: \"kubernetes.io/projected/a837f9ac-906d-4b20-bcd8-5872d41923da-kube-api-access-8wvv4\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.807122 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a837f9ac-906d-4b20-bcd8-5872d41923da-config\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.807146 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a837f9ac-906d-4b20-bcd8-5872d41923da-client-ca\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.807182 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a837f9ac-906d-4b20-bcd8-5872d41923da-serving-cert\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.808892 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a837f9ac-906d-4b20-bcd8-5872d41923da-client-ca\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.809899 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a837f9ac-906d-4b20-bcd8-5872d41923da-config\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.816608 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a837f9ac-906d-4b20-bcd8-5872d41923da-serving-cert\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.826053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wvv4\" (UniqueName: \"kubernetes.io/projected/a837f9ac-906d-4b20-bcd8-5872d41923da-kube-api-access-8wvv4\") pod \"route-controller-manager-7f4666cb67-6vfkb\" (UID: \"a837f9ac-906d-4b20-bcd8-5872d41923da\") " pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:14 crc kubenswrapper[4933]: I1202 15:58:14.914286 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:15 crc kubenswrapper[4933]: I1202 15:58:15.060582 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c174fb0-8328-4292-8c6a-156d6ae892b8" path="/var/lib/kubelet/pods/3c174fb0-8328-4292-8c6a-156d6ae892b8/volumes" Dec 02 15:58:15 crc kubenswrapper[4933]: I1202 15:58:15.372518 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb"] Dec 02 15:58:16 crc kubenswrapper[4933]: W1202 15:58:16.933536 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda837f9ac_906d_4b20_bcd8_5872d41923da.slice/crio-0e8339da7171ec00ec5041a882383d44f43b8d84ef46f79f1ef869b97c4082f3 WatchSource:0}: Error finding container 0e8339da7171ec00ec5041a882383d44f43b8d84ef46f79f1ef869b97c4082f3: Status 404 returned error can't find the container with id 0e8339da7171ec00ec5041a882383d44f43b8d84ef46f79f1ef869b97c4082f3 Dec 02 15:58:17 crc kubenswrapper[4933]: I1202 15:58:17.129936 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" event={"ID":"a837f9ac-906d-4b20-bcd8-5872d41923da","Type":"ContainerStarted","Data":"0e8339da7171ec00ec5041a882383d44f43b8d84ef46f79f1ef869b97c4082f3"} Dec 02 15:58:17 crc kubenswrapper[4933]: I1202 15:58:17.169659 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 15:58:17 crc kubenswrapper[4933]: I1202 15:58:17.169724 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 15:58:17 crc kubenswrapper[4933]: I1202 15:58:17.775176 4933 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:17 crc kubenswrapper[4933]: I1202 15:58:17.775867 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:17 crc kubenswrapper[4933]: I1202 15:58:17.785856 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:18 crc kubenswrapper[4933]: I1202 15:58:18.141180 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64864bcb5f-7cr62" Dec 02 15:58:18 crc kubenswrapper[4933]: I1202 15:58:18.207461 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lcwgg"] Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.141847 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" event={"ID":"a837f9ac-906d-4b20-bcd8-5872d41923da","Type":"ContainerStarted","Data":"2b05376208bacf95eba35a774d0b2c542e03344e35278b406b443aa0843a38f8"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.142198 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.145640 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"e8eff6e7ffdabb6b68e702e1f558a2357c0f4273f14330219d21574d8bf566fa"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.145915 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"2af2ee45c28f64a0764288df7df76bb52a9bdfcc9c7edeffdee43d78e83aa935"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.145928 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"0bec1b83aad2d1db73fdbae8de6269542ab5812ee9a773d4ccf10f2e318899dc"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.145938 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"ed3e4a3de89efea3797c3a62349c87db96a67c40569cf0a0e5df0e7a3e4a002a"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.145948 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"8c0e5d64a5d2d7819ef9f1cd0e1e6fff699d1e5536832e689babde25cfaf4e4c"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.145957 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9b8cb102-56f2-48f2-a9cd-3811df574aaf","Type":"ContainerStarted","Data":"d0c09e240a289254c6e19420e210c9caf22aa9f35bbb47f0646ace0ef5e454fc"} Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.147722 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.157340 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f4666cb67-6vfkb" podStartSLOduration=6.157322161 podStartE2EDuration="6.157322161s" podCreationTimestamp="2025-12-02 15:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:58:19.156087666 +0000 UTC m=+362.407314389" watchObservedRunningTime="2025-12-02 15:58:19.157322161 +0000 UTC m=+362.408548874" Dec 02 15:58:19 crc kubenswrapper[4933]: I1202 15:58:19.205531 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.257811867 podStartE2EDuration="11.205511479s" podCreationTimestamp="2025-12-02 15:58:08 +0000 UTC" firstStartedPulling="2025-12-02 15:58:12.065210009 +0000 UTC m=+355.316436712" lastFinishedPulling="2025-12-02 15:58:17.012909621 +0000 UTC m=+360.264136324" observedRunningTime="2025-12-02 15:58:19.204182851 +0000 UTC m=+362.455409584" watchObservedRunningTime="2025-12-02 15:58:19.205511479 +0000 UTC m=+362.456738182" Dec 02 15:58:24 crc kubenswrapper[4933]: I1202 15:58:24.044062 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:58:27 crc kubenswrapper[4933]: I1202 15:58:27.858050 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:27 crc kubenswrapper[4933]: I1202 15:58:27.860024 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:43 crc kubenswrapper[4933]: I1202 15:58:43.248397 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lcwgg" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerName="console" containerID="cri-o://aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400" gracePeriod=15 Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.733900 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lcwgg_fcc5c745-13b8-4ff8-b677-095dd8a46081/console/0.log" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.734345 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862017 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-oauth-serving-cert\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862098 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-service-ca\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862145 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-config\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862174 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd48q\" (UniqueName: \"kubernetes.io/projected/fcc5c745-13b8-4ff8-b677-095dd8a46081-kube-api-access-rd48q\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862213 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-serving-cert\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862271 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-oauth-config\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.862296 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-trusted-ca-bundle\") pod \"fcc5c745-13b8-4ff8-b677-095dd8a46081\" (UID: \"fcc5c745-13b8-4ff8-b677-095dd8a46081\") " Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.863015 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.863008 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.863035 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-config" (OuterVolumeSpecName: "console-config") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.863753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-service-ca" (OuterVolumeSpecName: "service-ca") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.868871 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc5c745-13b8-4ff8-b677-095dd8a46081-kube-api-access-rd48q" (OuterVolumeSpecName: "kube-api-access-rd48q") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "kube-api-access-rd48q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.869004 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.872847 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fcc5c745-13b8-4ff8-b677-095dd8a46081" (UID: "fcc5c745-13b8-4ff8-b677-095dd8a46081"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963588 4933 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963630 4933 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963643 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963654 4933 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963667 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963678 4933 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcc5c745-13b8-4ff8-b677-095dd8a46081-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:44 crc kubenswrapper[4933]: I1202 15:58:44.963689 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd48q\" (UniqueName: \"kubernetes.io/projected/fcc5c745-13b8-4ff8-b677-095dd8a46081-kube-api-access-rd48q\") on node \"crc\" DevicePath \"\"" Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.322518 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lcwgg_fcc5c745-13b8-4ff8-b677-095dd8a46081/console/0.log" Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.322954 4933 generic.go:334] "Generic (PLEG): container finished" podID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerID="aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400" exitCode=2 Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.322994 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lcwgg" event={"ID":"fcc5c745-13b8-4ff8-b677-095dd8a46081","Type":"ContainerDied","Data":"aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400"} Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.323029 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lcwgg" event={"ID":"fcc5c745-13b8-4ff8-b677-095dd8a46081","Type":"ContainerDied","Data":"b75e4a2db0750eb91a9ff384a0d37c436e19ae4a376b29bcf9d645755d20500c"} Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.323056 4933 scope.go:117] "RemoveContainer" containerID="aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400" Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.323094 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lcwgg" Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.347492 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lcwgg"] Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.349061 4933 scope.go:117] "RemoveContainer" containerID="aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400" Dec 02 15:58:45 crc kubenswrapper[4933]: E1202 15:58:45.349501 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400\": container with ID starting with aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400 not found: ID does not exist" containerID="aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400" Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.349544 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400"} err="failed to get container status \"aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400\": rpc error: code = NotFound desc = could not find container \"aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400\": container with ID starting with aa44205b3705eb4d58add5f1ca3e85e96f3d8caf0d5a9efe641cb1bd3ad6c400 not found: ID does not exist" Dec 02 15:58:45 crc kubenswrapper[4933]: I1202 15:58:45.353042 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lcwgg"] Dec 02 15:58:47 crc kubenswrapper[4933]: I1202 15:58:47.060497 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" path="/var/lib/kubelet/pods/fcc5c745-13b8-4ff8-b677-095dd8a46081/volumes" Dec 02 15:58:47 crc kubenswrapper[4933]: I1202 15:58:47.169096 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 15:58:47 crc kubenswrapper[4933]: I1202 15:58:47.169161 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 15:58:47 crc kubenswrapper[4933]: I1202 15:58:47.863597 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:58:47 crc kubenswrapper[4933]: I1202 15:58:47.868235 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-c758f956-cq472" Dec 02 15:59:09 crc kubenswrapper[4933]: I1202 15:59:09.044486 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:59:09 crc kubenswrapper[4933]: I1202 15:59:09.073113 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:59:09 crc kubenswrapper[4933]: I1202 15:59:09.534167 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.180855 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.181358 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.182209 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.182910 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.182956 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488" gracePeriod=600 Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.581380 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488" exitCode=0 Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.581496 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488"} Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.581760 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"ad85216952cd1ae86bb25d4b70ffbba75c0b70e8a399ab6abbe21111aa2e2cec"} Dec 02 15:59:17 crc kubenswrapper[4933]: I1202 15:59:17.581805 4933 scope.go:117] "RemoveContainer" containerID="54194f3459a2bbe748821e4f8e94abdd18e7c4e483d4cc2c9d5b765db584dd01" Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.791005 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d6c4458c4-cfwr8"] Dec 02 15:59:38 crc kubenswrapper[4933]: E1202 15:59:38.791886 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerName="console" Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.791902 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerName="console" Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.792055 4933 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc5c745-13b8-4ff8-b677-095dd8a46081" containerName="console"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.792550 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.809035 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d6c4458c4-cfwr8"]
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.873872 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-oauth-config\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.873951 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-serving-cert\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.873989 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-console-config\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.874023 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-oauth-serving-cert\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.874071 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-service-ca\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.874095 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-trusted-ca-bundle\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.874138 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2tx\" (UniqueName: \"kubernetes.io/projected/af27e767-99b2-48f8-99d5-19772d768d0c-kube-api-access-2m2tx\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975504 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-service-ca\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975582 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-trusted-ca-bundle\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975629 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2tx\" (UniqueName: \"kubernetes.io/projected/af27e767-99b2-48f8-99d5-19772d768d0c-kube-api-access-2m2tx\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975672 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-oauth-config\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975701 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-serving-cert\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975719 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-console-config\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.975745 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-oauth-serving-cert\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.976964 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-service-ca\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.977063 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-oauth-serving-cert\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.977195 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-trusted-ca-bundle\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.977228 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-console-config\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.980979 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-oauth-config\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.981534 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-serving-cert\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:38 crc kubenswrapper[4933]: I1202 15:59:38.996228 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2tx\" (UniqueName: \"kubernetes.io/projected/af27e767-99b2-48f8-99d5-19772d768d0c-kube-api-access-2m2tx\") pod \"console-6d6c4458c4-cfwr8\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:39 crc kubenswrapper[4933]: I1202 15:59:39.202776 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:39 crc kubenswrapper[4933]: I1202 15:59:39.779748 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d6c4458c4-cfwr8"]
Dec 02 15:59:40 crc kubenswrapper[4933]: I1202 15:59:40.741750 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d6c4458c4-cfwr8" event={"ID":"af27e767-99b2-48f8-99d5-19772d768d0c","Type":"ContainerStarted","Data":"d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5"}
Dec 02 15:59:40 crc kubenswrapper[4933]: I1202 15:59:40.742184 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d6c4458c4-cfwr8" event={"ID":"af27e767-99b2-48f8-99d5-19772d768d0c","Type":"ContainerStarted","Data":"ade3b1fe68ed83b8a549e7cef3b7d55771370257aff8f6c845ebd35d0d7cbf40"}
Dec 02 15:59:40 crc kubenswrapper[4933]: I1202 15:59:40.778148 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d6c4458c4-cfwr8" podStartSLOduration=2.778117711 podStartE2EDuration="2.778117711s" podCreationTimestamp="2025-12-02 15:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 15:59:40.767334786 +0000 UTC m=+444.018561499" watchObservedRunningTime="2025-12-02 15:59:40.778117711 +0000 UTC m=+444.029344454"
Dec 02 15:59:49 crc kubenswrapper[4933]: I1202 15:59:49.203916 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:49 crc kubenswrapper[4933]: I1202 15:59:49.204729 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:49 crc kubenswrapper[4933]: I1202 15:59:49.209251 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:49 crc kubenswrapper[4933]: I1202 15:59:49.818231 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d6c4458c4-cfwr8"
Dec 02 15:59:49 crc kubenswrapper[4933]: I1202 15:59:49.871870 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64864bcb5f-7cr62"]
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.177455 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"]
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.178752 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.180529 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.180615 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.186079 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"]
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.294415 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhb5t\" (UniqueName: \"kubernetes.io/projected/f5122e65-9bfd-4382-8057-177aa3d4450c-kube-api-access-rhb5t\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.294498 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5122e65-9bfd-4382-8057-177aa3d4450c-secret-volume\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.294606 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5122e65-9bfd-4382-8057-177aa3d4450c-config-volume\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.395527 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhb5t\" (UniqueName: \"kubernetes.io/projected/f5122e65-9bfd-4382-8057-177aa3d4450c-kube-api-access-rhb5t\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.395596 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5122e65-9bfd-4382-8057-177aa3d4450c-secret-volume\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.395663 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5122e65-9bfd-4382-8057-177aa3d4450c-config-volume\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.397956 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5122e65-9bfd-4382-8057-177aa3d4450c-config-volume\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.404749 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5122e65-9bfd-4382-8057-177aa3d4450c-secret-volume\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.412616 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhb5t\" (UniqueName: \"kubernetes.io/projected/f5122e65-9bfd-4382-8057-177aa3d4450c-kube-api-access-rhb5t\") pod \"collect-profiles-29411520-6kmwh\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.532871 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:00 crc kubenswrapper[4933]: I1202 16:00:00.950077 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"]
Dec 02 16:00:01 crc kubenswrapper[4933]: I1202 16:00:01.895194 4933 generic.go:334] "Generic (PLEG): container finished" podID="f5122e65-9bfd-4382-8057-177aa3d4450c" containerID="1b55922489ae09b24390d69952c65f42a945ff2320201273adb2f754532f3c1a" exitCode=0
Dec 02 16:00:01 crc kubenswrapper[4933]: I1202 16:00:01.895265 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh" event={"ID":"f5122e65-9bfd-4382-8057-177aa3d4450c","Type":"ContainerDied","Data":"1b55922489ae09b24390d69952c65f42a945ff2320201273adb2f754532f3c1a"}
Dec 02 16:00:01 crc kubenswrapper[4933]: I1202 16:00:01.895591 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh" event={"ID":"f5122e65-9bfd-4382-8057-177aa3d4450c","Type":"ContainerStarted","Data":"4d49f150321c0741c6dab28d090d0406c6bfecc0d5662c922162c6951c0aab72"}
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.151893 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.333542 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5122e65-9bfd-4382-8057-177aa3d4450c-config-volume\") pod \"f5122e65-9bfd-4382-8057-177aa3d4450c\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") "
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.334006 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhb5t\" (UniqueName: \"kubernetes.io/projected/f5122e65-9bfd-4382-8057-177aa3d4450c-kube-api-access-rhb5t\") pod \"f5122e65-9bfd-4382-8057-177aa3d4450c\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") "
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.334187 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5122e65-9bfd-4382-8057-177aa3d4450c-secret-volume\") pod \"f5122e65-9bfd-4382-8057-177aa3d4450c\" (UID: \"f5122e65-9bfd-4382-8057-177aa3d4450c\") "
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.334608 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5122e65-9bfd-4382-8057-177aa3d4450c-config-volume" (OuterVolumeSpecName: "config-volume") pod "f5122e65-9bfd-4382-8057-177aa3d4450c" (UID: "f5122e65-9bfd-4382-8057-177aa3d4450c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.341042 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5122e65-9bfd-4382-8057-177aa3d4450c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f5122e65-9bfd-4382-8057-177aa3d4450c" (UID: "f5122e65-9bfd-4382-8057-177aa3d4450c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.341100 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5122e65-9bfd-4382-8057-177aa3d4450c-kube-api-access-rhb5t" (OuterVolumeSpecName: "kube-api-access-rhb5t") pod "f5122e65-9bfd-4382-8057-177aa3d4450c" (UID: "f5122e65-9bfd-4382-8057-177aa3d4450c"). InnerVolumeSpecName "kube-api-access-rhb5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.436396 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5122e65-9bfd-4382-8057-177aa3d4450c-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.436454 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhb5t\" (UniqueName: \"kubernetes.io/projected/f5122e65-9bfd-4382-8057-177aa3d4450c-kube-api-access-rhb5t\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.436517 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5122e65-9bfd-4382-8057-177aa3d4450c-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.908895 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh" event={"ID":"f5122e65-9bfd-4382-8057-177aa3d4450c","Type":"ContainerDied","Data":"4d49f150321c0741c6dab28d090d0406c6bfecc0d5662c922162c6951c0aab72"}
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.908931 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d49f150321c0741c6dab28d090d0406c6bfecc0d5662c922162c6951c0aab72"
Dec 02 16:00:03 crc kubenswrapper[4933]: I1202 16:00:03.909020 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"
Dec 02 16:00:14 crc kubenswrapper[4933]: I1202 16:00:14.926171 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-64864bcb5f-7cr62" podUID="13082fa8-865a-4639-8384-f140198b768f" containerName="console" containerID="cri-o://6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7" gracePeriod=15
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.273409 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64864bcb5f-7cr62_13082fa8-865a-4639-8384-f140198b768f/console/0.log"
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.273686 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64864bcb5f-7cr62"
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458337 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-service-ca\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458384 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-serving-cert\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458405 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-trusted-ca-bundle\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458422 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-console-config\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458447 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-oauth-serving-cert\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458509 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dftdh\" (UniqueName: \"kubernetes.io/projected/13082fa8-865a-4639-8384-f140198b768f-kube-api-access-dftdh\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.458542 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-oauth-config\") pod \"13082fa8-865a-4639-8384-f140198b768f\" (UID: \"13082fa8-865a-4639-8384-f140198b768f\") "
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.459900 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-console-config" (OuterVolumeSpecName: "console-config") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.460170 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-service-ca" (OuterVolumeSpecName: "service-ca") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.460311 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.460525 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.464538 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.464891 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13082fa8-865a-4639-8384-f140198b768f-kube-api-access-dftdh" (OuterVolumeSpecName: "kube-api-access-dftdh") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "kube-api-access-dftdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.465241 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "13082fa8-865a-4639-8384-f140198b768f" (UID: "13082fa8-865a-4639-8384-f140198b768f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560583 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dftdh\" (UniqueName: \"kubernetes.io/projected/13082fa8-865a-4639-8384-f140198b768f-kube-api-access-dftdh\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560625 4933 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560640 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-service-ca\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560659 4933 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13082fa8-865a-4639-8384-f140198b768f-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560671 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560683 4933 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-console-config\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.560698 4933 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13082fa8-865a-4639-8384-f140198b768f-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.981572 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64864bcb5f-7cr62_13082fa8-865a-4639-8384-f140198b768f/console/0.log"
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.981852 4933 generic.go:334] "Generic (PLEG): container finished" podID="13082fa8-865a-4639-8384-f140198b768f" containerID="6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7" exitCode=2
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.981882 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64864bcb5f-7cr62" event={"ID":"13082fa8-865a-4639-8384-f140198b768f","Type":"ContainerDied","Data":"6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7"}
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.981907 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64864bcb5f-7cr62" event={"ID":"13082fa8-865a-4639-8384-f140198b768f","Type":"ContainerDied","Data":"8bcf3d04f6c38abc3da4b0ba3430e208fa5643c20a42769f0f3a213293451f53"}
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.981921 4933 scope.go:117] "RemoveContainer" containerID="6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7"
Dec 02 16:00:15 crc kubenswrapper[4933]: I1202 16:00:15.982037 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64864bcb5f-7cr62"
Dec 02 16:00:16 crc kubenswrapper[4933]: I1202 16:00:16.011725 4933 scope.go:117] "RemoveContainer" containerID="6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7"
Dec 02 16:00:16 crc kubenswrapper[4933]: E1202 16:00:16.012279 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7\": container with ID starting with 6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7 not found: ID does not exist" containerID="6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7"
Dec 02 16:00:16 crc kubenswrapper[4933]: I1202 16:00:16.012482 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7"} err="failed to get container status \"6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7\": rpc error: code = NotFound desc = could not find container \"6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7\": container with ID starting with 6e85931020b2439f6b42ac837577835cf09429bd198a15ae90d8496c9caebea7 not found: ID does not exist"
Dec 02 16:00:16 crc kubenswrapper[4933]: I1202 16:00:16.016542 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64864bcb5f-7cr62"]
Dec 02 16:00:16 crc kubenswrapper[4933]: I1202 16:00:16.022790 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64864bcb5f-7cr62"]
Dec 02 16:00:17 crc kubenswrapper[4933]: I1202 16:00:17.062759 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13082fa8-865a-4639-8384-f140198b768f" path="/var/lib/kubelet/pods/13082fa8-865a-4639-8384-f140198b768f/volumes"
Dec 02 16:01:17 crc kubenswrapper[4933]: I1202 16:01:17.169659 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:01:17 crc kubenswrapper[4933]: I1202 16:01:17.170627 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:01:17 crc kubenswrapper[4933]: I1202 16:01:17.255553 4933 scope.go:117] "RemoveContainer" containerID="115ff0ad4b0b271baef8b36cfbfbb04e13362ecf4986a5fa009c168aa3b0bc64"
Dec 02 16:01:47 crc kubenswrapper[4933]: I1202 16:01:47.169363 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:01:47 crc kubenswrapper[4933]: I1202 16:01:47.169903 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.169591 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.170287 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.170399 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.171700 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad85216952cd1ae86bb25d4b70ffbba75c0b70e8a399ab6abbe21111aa2e2cec"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.171795 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://ad85216952cd1ae86bb25d4b70ffbba75c0b70e8a399ab6abbe21111aa2e2cec" gracePeriod=600
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.288248 4933 scope.go:117] "RemoveContainer" containerID="b7908cb3ce05f99c0ff3b8500eb3564591f14c669382713dd6aca2221b612082"
Dec 02 16:02:17 crc kubenswrapper[4933]: I1202 16:02:17.341301 4933 scope.go:117] "RemoveContainer" containerID="8361cc21fe6cf67feae0c8dccefb304d37c678dc42099934e664c3eb8a8446fc"
Dec 02 16:02:18 crc kubenswrapper[4933]: I1202 16:02:18.147416 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="ad85216952cd1ae86bb25d4b70ffbba75c0b70e8a399ab6abbe21111aa2e2cec" exitCode=0
Dec 02 16:02:18 crc kubenswrapper[4933]: I1202 16:02:18.147485 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"ad85216952cd1ae86bb25d4b70ffbba75c0b70e8a399ab6abbe21111aa2e2cec"}
Dec 02 16:02:18 crc kubenswrapper[4933]: I1202 16:02:18.147750 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"338ae7bd911845a8a5e2ec13c95bb1233e30afce64a9ca00f9d9ea1398ecf073"}
Dec 02 16:02:18 crc kubenswrapper[4933]: I1202 16:02:18.147776 4933 scope.go:117] "RemoveContainer" containerID="84975a1489617fef6b371e8f083c03eea00aaba65ae72fcc68b86a8281f97488"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.186740 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"]
Dec 02 16:03:17 crc kubenswrapper[4933]: E1202 16:03:17.187623 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13082fa8-865a-4639-8384-f140198b768f" containerName="console"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.187640 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="13082fa8-865a-4639-8384-f140198b768f" containerName="console"
Dec 02 16:03:17 crc kubenswrapper[4933]: E1202 16:03:17.187661 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5122e65-9bfd-4382-8057-177aa3d4450c" containerName="collect-profiles"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.187670 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5122e65-9bfd-4382-8057-177aa3d4450c" containerName="collect-profiles"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.187793 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="13082fa8-865a-4639-8384-f140198b768f" containerName="console"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.187856 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5122e65-9bfd-4382-8057-177aa3d4450c" containerName="collect-profiles"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.189512 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.194058 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.199955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.200005 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27bq\" (UniqueName: \"kubernetes.io/projected/90462580-2133-44d1-b7bc-c838b66ef30b-kube-api-access-v27bq\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.200078 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.201440 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"]
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.301265 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.301324 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v27bq\" (UniqueName: \"kubernetes.io/projected/90462580-2133-44d1-b7bc-c838b66ef30b-kube-api-access-v27bq\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.301363 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.301615 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.301660 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.318672 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27bq\" (UniqueName: \"kubernetes.io/projected/90462580-2133-44d1-b7bc-c838b66ef30b-kube-api-access-v27bq\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.519079 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:17 crc kubenswrapper[4933]: I1202 16:03:17.790342 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"]
Dec 02 16:03:18 crc kubenswrapper[4933]: I1202 16:03:18.565289 4933 generic.go:334] "Generic (PLEG): container finished" podID="90462580-2133-44d1-b7bc-c838b66ef30b" containerID="0d0deeedbc52bb48a732da29cbce749ada81950a1c179b0e65cad85f7986f1a0" exitCode=0
Dec 02 16:03:18 crc kubenswrapper[4933]: I1202 16:03:18.565348 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd" event={"ID":"90462580-2133-44d1-b7bc-c838b66ef30b","Type":"ContainerDied","Data":"0d0deeedbc52bb48a732da29cbce749ada81950a1c179b0e65cad85f7986f1a0"}
Dec 02 16:03:18 crc kubenswrapper[4933]: I1202 16:03:18.565595 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd" event={"ID":"90462580-2133-44d1-b7bc-c838b66ef30b","Type":"ContainerStarted","Data":"b48ecae157af84b7b410ffc6607614f8cec6b63ba543d3c94077c076d00b5988"}
Dec 02 16:03:18 crc kubenswrapper[4933]: I1202 16:03:18.567228 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 16:03:21 crc kubenswrapper[4933]: I1202 16:03:21.583003 4933 generic.go:334] "Generic (PLEG): container finished" podID="90462580-2133-44d1-b7bc-c838b66ef30b" containerID="621e74ad87dec05fcfab92b7345d5932b5e0b3343e965aa8447c88fbbc691a83" exitCode=0
Dec 02 16:03:21 crc kubenswrapper[4933]: I1202 16:03:21.583070 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd" event={"ID":"90462580-2133-44d1-b7bc-c838b66ef30b","Type":"ContainerDied","Data":"621e74ad87dec05fcfab92b7345d5932b5e0b3343e965aa8447c88fbbc691a83"}
Dec 02 16:03:22 crc kubenswrapper[4933]: I1202 16:03:22.594494 4933 generic.go:334] "Generic (PLEG): container finished" podID="90462580-2133-44d1-b7bc-c838b66ef30b" containerID="d062cf392443854ecb39c1f4ff538f979128fb96ddd5d3056895ed3320c8e251" exitCode=0
Dec 02 16:03:22 crc kubenswrapper[4933]: I1202 16:03:22.595050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd" event={"ID":"90462580-2133-44d1-b7bc-c838b66ef30b","Type":"ContainerDied","Data":"d062cf392443854ecb39c1f4ff538f979128fb96ddd5d3056895ed3320c8e251"}
Dec 02 16:03:23 crc kubenswrapper[4933]: I1202 16:03:23.875987 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:23 crc kubenswrapper[4933]: I1202 16:03:23.986189 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-util\") pod \"90462580-2133-44d1-b7bc-c838b66ef30b\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") "
Dec 02 16:03:23 crc kubenswrapper[4933]: I1202 16:03:23.986301 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-bundle\") pod \"90462580-2133-44d1-b7bc-c838b66ef30b\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") "
Dec 02 16:03:23 crc kubenswrapper[4933]: I1202 16:03:23.986353 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v27bq\" (UniqueName: \"kubernetes.io/projected/90462580-2133-44d1-b7bc-c838b66ef30b-kube-api-access-v27bq\") pod \"90462580-2133-44d1-b7bc-c838b66ef30b\" (UID: \"90462580-2133-44d1-b7bc-c838b66ef30b\") "
Dec 02 16:03:23 crc kubenswrapper[4933]: I1202 16:03:23.988552 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-bundle" (OuterVolumeSpecName: "bundle") pod "90462580-2133-44d1-b7bc-c838b66ef30b" (UID: "90462580-2133-44d1-b7bc-c838b66ef30b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:03:23 crc kubenswrapper[4933]: I1202 16:03:23.993378 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90462580-2133-44d1-b7bc-c838b66ef30b-kube-api-access-v27bq" (OuterVolumeSpecName: "kube-api-access-v27bq") pod "90462580-2133-44d1-b7bc-c838b66ef30b" (UID: "90462580-2133-44d1-b7bc-c838b66ef30b"). InnerVolumeSpecName "kube-api-access-v27bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.009952 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-util" (OuterVolumeSpecName: "util") pod "90462580-2133-44d1-b7bc-c838b66ef30b" (UID: "90462580-2133-44d1-b7bc-c838b66ef30b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.088501 4933 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-util\") on node \"crc\" DevicePath \"\""
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.088533 4933 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90462580-2133-44d1-b7bc-c838b66ef30b-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.088543 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v27bq\" (UniqueName: \"kubernetes.io/projected/90462580-2133-44d1-b7bc-c838b66ef30b-kube-api-access-v27bq\") on node \"crc\" DevicePath \"\""
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.609761 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd" event={"ID":"90462580-2133-44d1-b7bc-c838b66ef30b","Type":"ContainerDied","Data":"b48ecae157af84b7b410ffc6607614f8cec6b63ba543d3c94077c076d00b5988"}
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.609813 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48ecae157af84b7b410ffc6607614f8cec6b63ba543d3c94077c076d00b5988"
Dec 02 16:03:24 crc kubenswrapper[4933]: I1202 16:03:24.609813 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd"
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.388177 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mklc"]
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390105 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-controller" containerID="cri-o://14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390241 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="nbdb" containerID="cri-o://7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390302 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390425 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="northd" containerID="cri-o://170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390467 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-acl-logging" containerID="cri-o://a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390518 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-node" containerID="cri-o://3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.390804 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="sbdb" containerID="cri-o://9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" gracePeriod=30
Dec 02 16:03:28 crc kubenswrapper[4933]: I1202 16:03:28.443101 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" containerID="cri-o://6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" gracePeriod=30
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.617413 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/3.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.621140 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovn-acl-logging/0.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.621538 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovn-controller/0.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.621902 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.644564 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/2.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.645323 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/1.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.645396 4933 generic.go:334] "Generic (PLEG): container finished" podID="b033c545-93a2-4401-842b-22456e44216b" containerID="fdc8b3cab6425c14392d3e1388ade4844bac3e7998924034f044e5a47d73be67" exitCode=2
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.645441 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerDied","Data":"fdc8b3cab6425c14392d3e1388ade4844bac3e7998924034f044e5a47d73be67"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.645504 4933 scope.go:117] "RemoveContainer" containerID="cf98d43fe6e267ef8509fcedf5375bfd2049a7a7964ddcb2cb97b6710013fb7a"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.646348 4933 scope.go:117] "RemoveContainer" containerID="fdc8b3cab6425c14392d3e1388ade4844bac3e7998924034f044e5a47d73be67"
Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.646605 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-z6kjz_openshift-multus(b033c545-93a2-4401-842b-22456e44216b)\"" pod="openshift-multus/multus-z6kjz" podUID="b033c545-93a2-4401-842b-22456e44216b"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.649111 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovnkube-controller/3.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.650533 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovn-acl-logging/0.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.650889 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mklc_1972064c-ea30-421c-b009-2bc675a98fcc/ovn-controller/0.log"
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654260 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" exitCode=0
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654302 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" exitCode=0
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654312 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" exitCode=0
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654325 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" exitCode=0
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654334 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" exitCode=0
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654344 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" exitCode=0
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654353 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" exitCode=143
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654363 4933 generic.go:334] "Generic (PLEG): container finished" podID="1972064c-ea30-421c-b009-2bc675a98fcc" containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" exitCode=143
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654340 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654423 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654449 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654466 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654483 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654503 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654521 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654540 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654550 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654559 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654568 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654578 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654585 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654591 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654598 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654605 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654617 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654630 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654640 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654648 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654656 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654663 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654671 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654678 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654686 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654693 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654699 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654709 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654729 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654744 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654753 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654762 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654771 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654780 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654788 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654795 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"}
Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654803 4933 pod_container_deletor.go:114] "Failed to issue the request to
remove container" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654810 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654840 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" event={"ID":"1972064c-ea30-421c-b009-2bc675a98fcc","Type":"ContainerDied","Data":"25abdf54a55a12332b638f065187ad0f7c9d3127374dcb58155f40674bc59d43"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654854 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654864 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654873 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654881 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654890 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654898 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654906 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654914 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654922 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654930 4933 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.654371 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mklc" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.684588 4933 scope.go:117] "RemoveContainer" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.701472 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.717518 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ngtkv"] Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.717790 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.717816 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723474 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="sbdb" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723495 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="sbdb" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723506 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="extract" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723514 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="extract" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723527 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="pull" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723535 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="pull" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723544 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723552 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723565 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kubecfg-setup" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723578 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kubecfg-setup" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723601 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="util" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723608 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="util" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723621 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-acl-logging" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 
16:03:29.723632 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-acl-logging" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723649 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-node" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723661 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-node" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723684 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723692 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723702 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723716 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723736 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="northd" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723744 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="northd" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723764 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723773 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723784 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723791 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.723804 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="nbdb" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.723818 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="nbdb" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724117 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-acl-logging" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724140 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724153 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" 
containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724162 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="nbdb" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724170 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="kube-rbac-proxy-node" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724178 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724190 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724199 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724206 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovn-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724214 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="sbdb" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724223 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="northd" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724236 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="90462580-2133-44d1-b7bc-c838b66ef30b" containerName="extract" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.724388 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724400 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.724528 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" containerName="ovnkube-controller" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.726868 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.732230 4933 scope.go:117] "RemoveContainer" containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.752726 4933 scope.go:117] "RemoveContainer" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.770930 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcpjn\" (UniqueName: \"kubernetes.io/projected/1972064c-ea30-421c-b009-2bc675a98fcc-kube-api-access-pcpjn\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.770981 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-ovn\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771008 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1972064c-ea30-421c-b009-2bc675a98fcc-ovn-node-metrics-cert\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771041 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-node-log\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771066 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-var-lib-openvswitch\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771082 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-slash\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771102 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-openvswitch\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771135 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771175 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-env-overrides\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771197 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-config\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771219 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-netns\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771238 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-systemd-units\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771254 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-etc-openvswitch\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771275 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-kubelet\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771289 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-systemd\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771346 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-bin\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771369 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-log-socket\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771389 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-script-lib\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771406 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-ovn-kubernetes\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771427 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-netd\") pod \"1972064c-ea30-421c-b009-2bc675a98fcc\" (UID: \"1972064c-ea30-421c-b009-2bc675a98fcc\") " Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.771614 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.772525 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773174 4933 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773607 4933 scope.go:117] "RemoveContainer" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773767 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773801 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-node-log" (OuterVolumeSpecName: "node-log") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773844 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773867 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-slash" (OuterVolumeSpecName: "host-slash") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773888 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773913 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.773967 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.774100 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.774138 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.774307 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.774693 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.774931 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-log-socket" (OuterVolumeSpecName: "log-socket") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.774973 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.775048 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.775753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.776531 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1972064c-ea30-421c-b009-2bc675a98fcc-kube-api-access-pcpjn" (OuterVolumeSpecName: "kube-api-access-pcpjn") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "kube-api-access-pcpjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.782683 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1972064c-ea30-421c-b009-2bc675a98fcc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.793471 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1972064c-ea30-421c-b009-2bc675a98fcc" (UID: "1972064c-ea30-421c-b009-2bc675a98fcc"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.795917 4933 scope.go:117] "RemoveContainer" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.818014 4933 scope.go:117] "RemoveContainer" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.834277 4933 scope.go:117] "RemoveContainer" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.848418 4933 scope.go:117] "RemoveContainer" containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.865380 4933 scope.go:117] "RemoveContainer" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875273 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-ovnkube-config\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875328 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4mf\" (UniqueName: \"kubernetes.io/projected/64d8adc1-c851-429a-b58d-426086603759-kube-api-access-tl4mf\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875365 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-ovn\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875400 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-etc-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875425 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-systemd-units\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875443 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-var-lib-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875463 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-log-socket\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875486 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-cni-netd\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875507 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-ovnkube-script-lib\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-slash\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875586 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-cni-bin\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875609 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64d8adc1-c851-429a-b58d-426086603759-ovn-node-metrics-cert\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875634 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-run-ovn-kubernetes\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875652 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-kubelet\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875674 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-run-netns\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875692 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-env-overrides\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875708 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-node-log\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875733 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875757 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875780 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-systemd\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875843 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcpjn\" (UniqueName: \"kubernetes.io/projected/1972064c-ea30-421c-b009-2bc675a98fcc-kube-api-access-pcpjn\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875857 4933 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875868 4933 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1972064c-ea30-421c-b009-2bc675a98fcc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875878 4933 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875888 4933 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875898 4933 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875907 4933 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875917 4933 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875927 4933 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875940 4933 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875952 4933 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875965 4933 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875979 4933 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875989 4933 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.875997 4933 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.876007 4933 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.876017 4933 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.876028 4933 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1972064c-ea30-421c-b009-2bc675a98fcc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.876037 4933 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1972064c-ea30-421c-b009-2bc675a98fcc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.884526 4933 scope.go:117] "RemoveContainer" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.885062 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": container with ID starting with 6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629 not found: ID does not exist" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.885113 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"} err="failed to get container status \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": rpc error: code = NotFound desc = could not find container \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": container with ID starting with 6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.885150 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.885633 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": container with ID starting with 2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058 not found: ID does not exist" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.885683 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} err="failed to get container status \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": rpc error: code = NotFound desc = could not find container \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": container with ID starting with 2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.885731 4933 scope.go:117] "RemoveContainer" containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.886027 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": container with ID starting with 9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc not found: ID does not exist" containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.886062 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"} err="failed to get container status 
\"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": rpc error: code = NotFound desc = could not find container \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": container with ID starting with 9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.886111 4933 scope.go:117] "RemoveContainer" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.886391 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": container with ID starting with 7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20 not found: ID does not exist" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.886424 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"} err="failed to get container status \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": rpc error: code = NotFound desc = could not find container \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": container with ID starting with 7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.886444 4933 scope.go:117] "RemoveContainer" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.886708 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": container with ID starting with 170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140 not found: ID does not exist" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.886743 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"} err="failed to get container status \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": rpc error: code = NotFound desc = could not find container \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": container with ID starting with 170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.886761 4933 scope.go:117] "RemoveContainer" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.887340 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": container with ID starting with e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755 not found: ID does not exist" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.887373 4933 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"} err="failed to get container status \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": rpc error: code = NotFound desc = could not find container \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": container with ID starting with e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.887394 4933 scope.go:117] "RemoveContainer" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.887701 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": container with ID starting with 3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66 not found: ID does not exist" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.887736 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"} err="failed to get container status \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": rpc error: code = NotFound desc = could not find container \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": container with ID starting with 3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.887756 4933 scope.go:117] "RemoveContainer" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.888175 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": container with ID starting with a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d not found: ID does not exist" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.888207 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"} err="failed to get container status \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": rpc error: code = NotFound desc = could not find container \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": container with ID starting with a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.888225 4933 scope.go:117] "RemoveContainer" containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.888505 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": container with ID starting with 14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3 not found: ID does not exist" 
containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.888540 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} err="failed to get container status \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": rpc error: code = NotFound desc = could not find container \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": container with ID starting with 14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.888560 4933 scope.go:117] "RemoveContainer" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" Dec 02 16:03:29 crc kubenswrapper[4933]: E1202 16:03:29.888889 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": container with ID starting with d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376 not found: ID does not exist" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.888920 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} err="failed to get container status \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": rpc error: code = NotFound desc = could not find container \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": container with ID starting with d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.888940 4933 scope.go:117] "RemoveContainer" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.889206 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"} err="failed to get container status \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": rpc error: code = NotFound desc = could not find container \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": container with ID starting with 6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.889225 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.889558 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} err="failed to get container status \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": rpc error: code = NotFound desc = could not find container \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": container with ID starting with 2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.889585 4933 scope.go:117] "RemoveContainer" 
containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.889883 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"} err="failed to get container status \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": rpc error: code = NotFound desc = could not find container \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": container with ID starting with 9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.889971 4933 scope.go:117] "RemoveContainer" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.890277 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"} err="failed to get container status \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": rpc error: code = NotFound desc = could not find container \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": container with ID starting with 7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.890307 4933 scope.go:117] "RemoveContainer" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.890583 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"} err="failed to get container status \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": rpc error: code = NotFound desc = could not find container \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": container with ID starting with 170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.890617 4933 scope.go:117] "RemoveContainer" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.891130 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"} err="failed to get container status \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": rpc error: code = NotFound desc = could not find container \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": container with ID starting with e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.891163 4933 scope.go:117] "RemoveContainer" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.892044 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"} err="failed to get container status \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": rpc error: code = NotFound desc = could not find 
container \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": container with ID starting with 3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.892072 4933 scope.go:117] "RemoveContainer" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.892532 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"} err="failed to get container status \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": rpc error: code = NotFound desc = could not find container \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": container with ID starting with a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.892558 4933 scope.go:117] "RemoveContainer" containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.893015 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} err="failed to get container status \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": rpc error: code = NotFound desc = could not find container \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": container with ID starting with 14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.893042 4933 scope.go:117] "RemoveContainer" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.893358 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} err="failed to get container status \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": rpc error: code = NotFound desc = could not find container \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": container with ID starting with d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.893381 4933 scope.go:117] "RemoveContainer" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.893756 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"} err="failed to get container status \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": rpc error: code = NotFound desc = could not find container \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": container with ID starting with 6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.893776 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.894184 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} err="failed to get container status \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": rpc error: code = NotFound desc = could not find container \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": container with ID starting with 2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.894216 4933 scope.go:117] "RemoveContainer" containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.894593 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"} err="failed to get container status \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": rpc error: code = NotFound desc = could not find container \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": container with ID starting with 9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.894617 4933 scope.go:117] "RemoveContainer" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.894961 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"} err="failed to get container status \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": rpc error: code = NotFound desc = could not find container \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": container with ID starting with 7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.894995 4933 scope.go:117] "RemoveContainer" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.895304 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"} err="failed to get container status \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": rpc error: code = NotFound desc = could not find container \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": container with ID starting with 170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.895335 4933 scope.go:117] "RemoveContainer" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.895660 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"} err="failed to get container status \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": rpc error: code = NotFound desc = could not find container \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": container with ID starting with 
e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.895692 4933 scope.go:117] "RemoveContainer" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896039 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"} err="failed to get container status \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": rpc error: code = NotFound desc = could not find container \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": container with ID starting with 3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896058 4933 scope.go:117] "RemoveContainer" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896310 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"} err="failed to get container status \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": rpc error: code = NotFound desc = could not find container \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": container with ID starting with a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896337 4933 scope.go:117] "RemoveContainer" containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896607 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} err="failed to get container status \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": rpc error: code = NotFound desc = could not find container \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": container with ID starting with 14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896632 4933 scope.go:117] "RemoveContainer" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896952 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} err="failed to get container status \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": rpc error: code = NotFound desc = could not find container \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": container with ID starting with d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.896979 4933 scope.go:117] "RemoveContainer" containerID="6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.897206 4933 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629"} err="failed to get container status \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": rpc error: code = NotFound desc = could not find container \"6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629\": container with ID starting with 6c83315e1cf2e0f0b5ef4f463b5d421bdd6c99328c413d5e4bc17487fafcc629 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.897229 4933 scope.go:117] "RemoveContainer" containerID="2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.897606 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058"} err="failed to get container status \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": rpc error: code = NotFound desc = could not find container \"2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058\": container with ID starting with 2f7591b746ed9942ce7644e8005251baa49bdeaef3618aa1d679249fd3a96058 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.897625 4933 scope.go:117] "RemoveContainer" containerID="9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.897855 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc"} err="failed to get container status \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": rpc error: code = NotFound desc = could not find container \"9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc\": container with ID starting with 9ae6c9973a0f8983a5da9a3a25ba23d7d836e7f5bdaa60b60627d604ef3d2afc not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.897877 4933 scope.go:117] "RemoveContainer" containerID="7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898123 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20"} err="failed to get container status \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": rpc error: code = NotFound desc = could not find container \"7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20\": container with ID starting with 7d4c0805f39e3410185b2d898b61b42dfeb544cd47c1ac7570a099ece7921a20 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898143 4933 scope.go:117] "RemoveContainer" containerID="170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898342 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140"} err="failed to get container status \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": rpc error: code = NotFound desc = could not find container \"170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140\": container with ID starting with 170155e5803e9eabfc4866a3cd76bd74567d9ff6707bebfdf7f24a2330173140 not found: ID does not exist" Dec 
02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898363 4933 scope.go:117] "RemoveContainer" containerID="e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898551 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755"} err="failed to get container status \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": rpc error: code = NotFound desc = could not find container \"e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755\": container with ID starting with e2a3f2ac3ef3759389e0c4807b4235aea66b8d9570ef020af90110402296a755 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898571 4933 scope.go:117] "RemoveContainer" containerID="3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.898984 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66"} err="failed to get container status \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": rpc error: code = NotFound desc = could not find container \"3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66\": container with ID starting with 3920b75a79accaa154ca827b9d813f5bbb7c7c18369c0b6d3bb82cab653bcb66 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.899003 4933 scope.go:117] "RemoveContainer" containerID="a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.899203 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d"} err="failed to get container status \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": rpc error: code = NotFound desc = could not find container \"a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d\": container with ID starting with a499bb33dae499d565aa22f537a095fe9465bd1d68d829248fc7a896f692f67d not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.899233 4933 scope.go:117] "RemoveContainer" containerID="14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.899438 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3"} err="failed to get container status \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": rpc error: code = NotFound desc = could not find container \"14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3\": container with ID starting with 14c25265781df0f480ecebbc167b5e413d869433f7258cc299a2c2bc12ee2af3 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.899466 4933 scope.go:117] "RemoveContainer" containerID="d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.899703 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376"} err="failed to get container status 
\"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": rpc error: code = NotFound desc = could not find container \"d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376\": container with ID starting with d0062caebc274431e99ef7dc875a50f2f2d756215bd2700b7206582f68e5f376 not found: ID does not exist" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977575 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-ovnkube-script-lib\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977619 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-slash\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977666 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-cni-bin\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977690 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64d8adc1-c851-429a-b58d-426086603759-ovn-node-metrics-cert\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977712 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-run-ovn-kubernetes\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977758 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-kubelet\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977770 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-cni-bin\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977798 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-run-netns\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977777 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-run-netns\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977833 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-slash\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977899 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-kubelet\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977870 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-env-overrides\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977990 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-node-log\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.977985 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-run-ovn-kubernetes\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978089 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-node-log\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978122 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978168 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978226 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978261 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-systemd\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978311 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-ovnkube-config\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978338 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4mf\" (UniqueName: \"kubernetes.io/projected/64d8adc1-c851-429a-b58d-426086603759-kube-api-access-tl4mf\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978342 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-systemd\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978352 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-env-overrides\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978376 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978400 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-ovn\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978400 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-ovnkube-script-lib\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978436 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-run-ovn\") pod \"ovnkube-node-ngtkv\" (UID: 
\"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978506 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-etc-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978541 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-systemd-units\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978566 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-var-lib-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-etc-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978586 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-log-socket\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978619 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-systemd-units\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978622 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-cni-netd\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978638 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-host-cni-netd\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978677 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-var-lib-openvswitch\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc 
kubenswrapper[4933]: I1202 16:03:29.978693 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64d8adc1-c851-429a-b58d-426086603759-log-socket\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.978839 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64d8adc1-c851-429a-b58d-426086603759-ovnkube-config\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:29 crc kubenswrapper[4933]: I1202 16:03:29.981346 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64d8adc1-c851-429a-b58d-426086603759-ovn-node-metrics-cert\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.000489 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4mf\" (UniqueName: \"kubernetes.io/projected/64d8adc1-c851-429a-b58d-426086603759-kube-api-access-tl4mf\") pod \"ovnkube-node-ngtkv\" (UID: \"64d8adc1-c851-429a-b58d-426086603759\") " pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.005208 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mklc"] Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.014162 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mklc"] Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.060655 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" Dec 02 16:03:30 crc kubenswrapper[4933]: W1202 16:03:30.093193 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d8adc1_c851_429a_b58d_426086603759.slice/crio-c34f8d0e940bd1902c8afff4ffcb199d4534643a22858aec1afe89f909211c38 WatchSource:0}: Error finding container c34f8d0e940bd1902c8afff4ffcb199d4534643a22858aec1afe89f909211c38: Status 404 returned error can't find the container with id c34f8d0e940bd1902c8afff4ffcb199d4534643a22858aec1afe89f909211c38 Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.660816 4933 generic.go:334] "Generic (PLEG): container finished" podID="64d8adc1-c851-429a-b58d-426086603759" containerID="928d77e82c398c0d8c6cd08fee3822570a1f516fc683bab82182a6026267077e" exitCode=0 Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.660913 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerDied","Data":"928d77e82c398c0d8c6cd08fee3822570a1f516fc683bab82182a6026267077e"} Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.661244 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"c34f8d0e940bd1902c8afff4ffcb199d4534643a22858aec1afe89f909211c38"} Dec 02 16:03:30 crc kubenswrapper[4933]: I1202 16:03:30.664151 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/2.log" Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.060488 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1972064c-ea30-421c-b009-2bc675a98fcc" path="/var/lib/kubelet/pods/1972064c-ea30-421c-b009-2bc675a98fcc/volumes" Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.674422 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"5fb2079bb173caecd278170fbbdd72e8aa6f7831f8944e9b673d7b0772e9d1e5"} Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.674710 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"44042419de2530d520f1f7bf6bd273c053345bd9661191dc607ef68787630060"} Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.674722 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"b99125c58c4ea681b05c824e458f9037553420ea43b889b42aeba17d8e55b137"} Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.674730 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"a2bd5d189cd9ba3d18bb655df17b86a12128735fa0b5cd8347ff9dbc0dc774d7"} Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.674738 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" 
event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"eab347fc8dfc0a78c329101d206bd167a88ee62bc69cbc13e3b7c9d54906fb07"} Dec 02 16:03:31 crc kubenswrapper[4933]: I1202 16:03:31.674746 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"f045ea1882d20361a5b6a3a8b1e26f016cc946605e5c2b16765902d3cf0fc12c"} Dec 02 16:03:34 crc kubenswrapper[4933]: I1202 16:03:34.710631 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"b19c96b8917acaf12bc0d25735326faa32bc6b59a703d84edf364e76d4637387"} Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.074746 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"] Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.075496 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.077400 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-lb5sp" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.078201 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.078492 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.150288 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwj8z\" (UniqueName: \"kubernetes.io/projected/9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8-kube-api-access-pwj8z\") pod \"obo-prometheus-operator-668cf9dfbb-pcq8l\" (UID: \"9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.205533 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"] Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.206512 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.209265 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-c58jl" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.209915 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.227399 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"] Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.228358 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.252633 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15f56600-a066-40fd-8433-d0552173dc57-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz\" (UID: \"15f56600-a066-40fd-8433-d0552173dc57\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.252706 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwj8z\" (UniqueName: \"kubernetes.io/projected/9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8-kube-api-access-pwj8z\") pod \"obo-prometheus-operator-668cf9dfbb-pcq8l\" (UID: \"9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.252769 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15f56600-a066-40fd-8433-d0552173dc57-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz\" (UID: \"15f56600-a066-40fd-8433-d0552173dc57\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.252788 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8b5527a-5f6c-461a-8397-c911f538eb3a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj\" (UID: \"a8b5527a-5f6c-461a-8397-c911f538eb3a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.252864 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8b5527a-5f6c-461a-8397-c911f538eb3a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj\" (UID: \"a8b5527a-5f6c-461a-8397-c911f538eb3a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.276620 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwj8z\" (UniqueName: \"kubernetes.io/projected/9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8-kube-api-access-pwj8z\") pod \"obo-prometheus-operator-668cf9dfbb-pcq8l\" (UID: \"9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.354265 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15f56600-a066-40fd-8433-d0552173dc57-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz\" (UID: \"15f56600-a066-40fd-8433-d0552173dc57\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.354360 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/15f56600-a066-40fd-8433-d0552173dc57-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz\" (UID: \"15f56600-a066-40fd-8433-d0552173dc57\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.354382 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8b5527a-5f6c-461a-8397-c911f538eb3a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj\" (UID: \"a8b5527a-5f6c-461a-8397-c911f538eb3a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.354411 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8b5527a-5f6c-461a-8397-c911f538eb3a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj\" (UID: \"a8b5527a-5f6c-461a-8397-c911f538eb3a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.357739 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15f56600-a066-40fd-8433-d0552173dc57-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz\" (UID: \"15f56600-a066-40fd-8433-d0552173dc57\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.357854 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8b5527a-5f6c-461a-8397-c911f538eb3a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj\" (UID: \"a8b5527a-5f6c-461a-8397-c911f538eb3a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.360662 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15f56600-a066-40fd-8433-d0552173dc57-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz\" (UID: \"15f56600-a066-40fd-8433-d0552173dc57\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.361609 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8b5527a-5f6c-461a-8397-c911f538eb3a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj\" (UID: \"a8b5527a-5f6c-461a-8397-c911f538eb3a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.389908 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.408937 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-dpvq4"] Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.409684 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.414248 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-hmmdc" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.414378 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.420393 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(73f1e5c2b48a27a73eeab29cfda8cc1b80017fe378a4a58305b80abb269faeb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.420470 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(73f1e5c2b48a27a73eeab29cfda8cc1b80017fe378a4a58305b80abb269faeb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.420498 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(73f1e5c2b48a27a73eeab29cfda8cc1b80017fe378a4a58305b80abb269faeb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.420542 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators(9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators(9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(73f1e5c2b48a27a73eeab29cfda8cc1b80017fe378a4a58305b80abb269faeb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" podUID="9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.455885 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f22c4f9-d4f0-4ff8-9322-f03662c116a8-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-dpvq4\" (UID: \"9f22c4f9-d4f0-4ff8-9322-f03662c116a8\") " pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.455993 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x989r\" (UniqueName: \"kubernetes.io/projected/9f22c4f9-d4f0-4ff8-9322-f03662c116a8-kube-api-access-x989r\") pod \"observability-operator-d8bb48f5d-dpvq4\" (UID: \"9f22c4f9-d4f0-4ff8-9322-f03662c116a8\") " pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.519809 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.543081 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.546302 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(130713cb341e3f8fbe62244fa22dcbe3f75935076f2effdab22dc68d7663dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.546358 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(130713cb341e3f8fbe62244fa22dcbe3f75935076f2effdab22dc68d7663dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.546378 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(130713cb341e3f8fbe62244fa22dcbe3f75935076f2effdab22dc68d7663dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.546430 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators(15f56600-a066-40fd-8433-d0552173dc57)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators(15f56600-a066-40fd-8433-d0552173dc57)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(130713cb341e3f8fbe62244fa22dcbe3f75935076f2effdab22dc68d7663dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" podUID="15f56600-a066-40fd-8433-d0552173dc57" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.557875 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f22c4f9-d4f0-4ff8-9322-f03662c116a8-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-dpvq4\" (UID: \"9f22c4f9-d4f0-4ff8-9322-f03662c116a8\") " pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.557970 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x989r\" (UniqueName: \"kubernetes.io/projected/9f22c4f9-d4f0-4ff8-9322-f03662c116a8-kube-api-access-x989r\") pod \"observability-operator-d8bb48f5d-dpvq4\" (UID: \"9f22c4f9-d4f0-4ff8-9322-f03662c116a8\") " pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.561462 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f22c4f9-d4f0-4ff8-9322-f03662c116a8-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-dpvq4\" (UID: \"9f22c4f9-d4f0-4ff8-9322-f03662c116a8\") " pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.575555 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x989r\" (UniqueName: \"kubernetes.io/projected/9f22c4f9-d4f0-4ff8-9322-f03662c116a8-kube-api-access-x989r\") pod \"observability-operator-d8bb48f5d-dpvq4\" (UID: \"9f22c4f9-d4f0-4ff8-9322-f03662c116a8\") " pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.581878 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(b469652be2f07c00e65a6df2d920c6396d77f9f0bc2e6808964582a887a2485e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.581957 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(b469652be2f07c00e65a6df2d920c6396d77f9f0bc2e6808964582a887a2485e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.581987 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(b469652be2f07c00e65a6df2d920c6396d77f9f0bc2e6808964582a887a2485e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.582042 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators(a8b5527a-5f6c-461a-8397-c911f538eb3a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators(a8b5527a-5f6c-461a-8397-c911f538eb3a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(b469652be2f07c00e65a6df2d920c6396d77f9f0bc2e6808964582a887a2485e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" podUID="a8b5527a-5f6c-461a-8397-c911f538eb3a"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.609814 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-s9dt9"]
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.610797 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.619664 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-b9mp6"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.659789 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfqhw\" (UniqueName: \"kubernetes.io/projected/1ec8da33-52a7-4abb-a205-7c14a8186f5e-kube-api-access-zfqhw\") pod \"perses-operator-5446b9c989-s9dt9\" (UID: \"1ec8da33-52a7-4abb-a205-7c14a8186f5e\") " pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.659901 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ec8da33-52a7-4abb-a205-7c14a8186f5e-openshift-service-ca\") pod \"perses-operator-5446b9c989-s9dt9\" (UID: \"1ec8da33-52a7-4abb-a205-7c14a8186f5e\") " pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.747681 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.761315 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ec8da33-52a7-4abb-a205-7c14a8186f5e-openshift-service-ca\") pod \"perses-operator-5446b9c989-s9dt9\" (UID: \"1ec8da33-52a7-4abb-a205-7c14a8186f5e\") " pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.761354 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfqhw\" (UniqueName: \"kubernetes.io/projected/1ec8da33-52a7-4abb-a205-7c14a8186f5e-kube-api-access-zfqhw\") pod \"perses-operator-5446b9c989-s9dt9\" (UID: \"1ec8da33-52a7-4abb-a205-7c14a8186f5e\") " pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.762328 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ec8da33-52a7-4abb-a205-7c14a8186f5e-openshift-service-ca\") pod \"perses-operator-5446b9c989-s9dt9\" (UID: \"1ec8da33-52a7-4abb-a205-7c14a8186f5e\") " pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.769476 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(817a4cf1340313a90de16d3b066dfa01bd2d83d2748446e6baf98caad9d7d623): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.769548 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(817a4cf1340313a90de16d3b066dfa01bd2d83d2748446e6baf98caad9d7d623): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.769569 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(817a4cf1340313a90de16d3b066dfa01bd2d83d2748446e6baf98caad9d7d623): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:35 crc kubenswrapper[4933]: E1202 16:03:35.769612 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-dpvq4_openshift-operators(9f22c4f9-d4f0-4ff8-9322-f03662c116a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-dpvq4_openshift-operators(9f22c4f9-d4f0-4ff8-9322-f03662c116a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(817a4cf1340313a90de16d3b066dfa01bd2d83d2748446e6baf98caad9d7d623): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" podUID="9f22c4f9-d4f0-4ff8-9322-f03662c116a8"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.778315 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfqhw\" (UniqueName: \"kubernetes.io/projected/1ec8da33-52a7-4abb-a205-7c14a8186f5e-kube-api-access-zfqhw\") pod \"perses-operator-5446b9c989-s9dt9\" (UID: \"1ec8da33-52a7-4abb-a205-7c14a8186f5e\") " pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:35 crc kubenswrapper[4933]: I1202 16:03:35.926334 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:36 crc kubenswrapper[4933]: E1202 16:03:36.016376 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(5a9ec510832b4905902c8ead7186492898a7dc2b0cdb013289c826dee2ed24a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:36 crc kubenswrapper[4933]: E1202 16:03:36.016449 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(5a9ec510832b4905902c8ead7186492898a7dc2b0cdb013289c826dee2ed24a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:36 crc kubenswrapper[4933]: E1202 16:03:36.016476 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(5a9ec510832b4905902c8ead7186492898a7dc2b0cdb013289c826dee2ed24a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:36 crc kubenswrapper[4933]: E1202 16:03:36.016555 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-s9dt9_openshift-operators(1ec8da33-52a7-4abb-a205-7c14a8186f5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-s9dt9_openshift-operators(1ec8da33-52a7-4abb-a205-7c14a8186f5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(5a9ec510832b4905902c8ead7186492898a7dc2b0cdb013289c826dee2ed24a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-s9dt9" podUID="1ec8da33-52a7-4abb-a205-7c14a8186f5e"
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.723986 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" event={"ID":"64d8adc1-c851-429a-b58d-426086603759","Type":"ContainerStarted","Data":"39c6f10e657293d19d2a29c0050f6b7854381ab1b4c18bcc8c19c8ff1622e40e"}
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.724535 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv"
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.724577 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv"
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.724588 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv"
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.751509 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv"
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.767675 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv"
Dec 02 16:03:36 crc kubenswrapper[4933]: I1202 16:03:36.776038 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv" podStartSLOduration=7.776022395 podStartE2EDuration="7.776022395s" podCreationTimestamp="2025-12-02 16:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:03:36.774813973 +0000 UTC m=+680.026040696" watchObservedRunningTime="2025-12-02 16:03:36.776022395 +0000 UTC m=+680.027249098"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.318876 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"]
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.319351 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.319935 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.326474 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-dpvq4"]
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.326572 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.326939 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.332509 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-s9dt9"]
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.332594 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.343728 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.352167 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"]
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.352275 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.352791 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.373652 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"]
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.373809 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:37 crc kubenswrapper[4933]: I1202 16:03:37.382935 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.395032 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(6a9f7175fe6ec2f0d7ea7b4276158c0114d3394dee53c2b4b7ab08a66ccae19e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.395121 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(6a9f7175fe6ec2f0d7ea7b4276158c0114d3394dee53c2b4b7ab08a66ccae19e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.395148 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(6a9f7175fe6ec2f0d7ea7b4276158c0114d3394dee53c2b4b7ab08a66ccae19e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.395195 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-dpvq4_openshift-operators(9f22c4f9-d4f0-4ff8-9322-f03662c116a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-dpvq4_openshift-operators(9f22c4f9-d4f0-4ff8-9322-f03662c116a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(6a9f7175fe6ec2f0d7ea7b4276158c0114d3394dee53c2b4b7ab08a66ccae19e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" podUID="9f22c4f9-d4f0-4ff8-9322-f03662c116a8"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.399721 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(6c5a30b3e2ffd1517207be4d15e705ec4cb021a96e0383631a29a88e9a72edfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.399790 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(6c5a30b3e2ffd1517207be4d15e705ec4cb021a96e0383631a29a88e9a72edfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.399814 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(6c5a30b3e2ffd1517207be4d15e705ec4cb021a96e0383631a29a88e9a72edfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.399880 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators(15f56600-a066-40fd-8433-d0552173dc57)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators(15f56600-a066-40fd-8433-d0552173dc57)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(6c5a30b3e2ffd1517207be4d15e705ec4cb021a96e0383631a29a88e9a72edfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" podUID="15f56600-a066-40fd-8433-d0552173dc57"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.425708 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(24fc8fe3d9c1462184259937ad0226b8c4a288e07249217246c479dfe4504481): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.425774 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(24fc8fe3d9c1462184259937ad0226b8c4a288e07249217246c479dfe4504481): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.425802 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(24fc8fe3d9c1462184259937ad0226b8c4a288e07249217246c479dfe4504481): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.425860 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-s9dt9_openshift-operators(1ec8da33-52a7-4abb-a205-7c14a8186f5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-s9dt9_openshift-operators(1ec8da33-52a7-4abb-a205-7c14a8186f5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(24fc8fe3d9c1462184259937ad0226b8c4a288e07249217246c479dfe4504481): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-s9dt9" podUID="1ec8da33-52a7-4abb-a205-7c14a8186f5e"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.436015 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(2d755944ddf60c8d2f849f12165f4f193a7a7dd485a99dd106133d3294f6c6d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.436081 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(2d755944ddf60c8d2f849f12165f4f193a7a7dd485a99dd106133d3294f6c6d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.436106 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(2d755944ddf60c8d2f849f12165f4f193a7a7dd485a99dd106133d3294f6c6d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.436153 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators(9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators(9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(2d755944ddf60c8d2f849f12165f4f193a7a7dd485a99dd106133d3294f6c6d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" podUID="9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.464334 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(abc2da0cdee16712e56d9a373a6eb43184237c9021233abe5f2ce57867eb311f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.464400 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(abc2da0cdee16712e56d9a373a6eb43184237c9021233abe5f2ce57867eb311f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.464434 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(abc2da0cdee16712e56d9a373a6eb43184237c9021233abe5f2ce57867eb311f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:37 crc kubenswrapper[4933]: E1202 16:03:37.464486 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators(a8b5527a-5f6c-461a-8397-c911f538eb3a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators(a8b5527a-5f6c-461a-8397-c911f538eb3a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(abc2da0cdee16712e56d9a373a6eb43184237c9021233abe5f2ce57867eb311f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" podUID="a8b5527a-5f6c-461a-8397-c911f538eb3a"
Dec 02 16:03:41 crc kubenswrapper[4933]: I1202 16:03:41.053362 4933 scope.go:117] "RemoveContainer" containerID="fdc8b3cab6425c14392d3e1388ade4844bac3e7998924034f044e5a47d73be67"
Dec 02 16:03:41 crc kubenswrapper[4933]: E1202 16:03:41.055058 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-z6kjz_openshift-multus(b033c545-93a2-4401-842b-22456e44216b)\"" pod="openshift-multus/multus-z6kjz" podUID="b033c545-93a2-4401-842b-22456e44216b"
Dec 02 16:03:48 crc kubenswrapper[4933]: I1202 16:03:48.052228 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:48 crc kubenswrapper[4933]: I1202 16:03:48.053046 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:48 crc kubenswrapper[4933]: E1202 16:03:48.089916 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(3521cde134c179ee3d13172e1440078fa52adce21316f1b686d8c11f0152e622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:48 crc kubenswrapper[4933]: E1202 16:03:48.090186 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(3521cde134c179ee3d13172e1440078fa52adce21316f1b686d8c11f0152e622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:48 crc kubenswrapper[4933]: E1202 16:03:48.090213 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(3521cde134c179ee3d13172e1440078fa52adce21316f1b686d8c11f0152e622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:03:48 crc kubenswrapper[4933]: E1202 16:03:48.090260 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-dpvq4_openshift-operators(9f22c4f9-d4f0-4ff8-9322-f03662c116a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-dpvq4_openshift-operators(9f22c4f9-d4f0-4ff8-9322-f03662c116a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-dpvq4_openshift-operators_9f22c4f9-d4f0-4ff8-9322-f03662c116a8_0(3521cde134c179ee3d13172e1440078fa52adce21316f1b686d8c11f0152e622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" podUID="9f22c4f9-d4f0-4ff8-9322-f03662c116a8"
Dec 02 16:03:49 crc kubenswrapper[4933]: I1202 16:03:49.053422 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:49 crc kubenswrapper[4933]: I1202 16:03:49.053947 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:49 crc kubenswrapper[4933]: E1202 16:03:49.090576 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(aa96c48a44dfd68fe19362e27f35d84a9c91abcff404ef4c2bef8e6f20c43e64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:49 crc kubenswrapper[4933]: E1202 16:03:49.090913 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(aa96c48a44dfd68fe19362e27f35d84a9c91abcff404ef4c2bef8e6f20c43e64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:49 crc kubenswrapper[4933]: E1202 16:03:49.090943 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(aa96c48a44dfd68fe19362e27f35d84a9c91abcff404ef4c2bef8e6f20c43e64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:03:49 crc kubenswrapper[4933]: E1202 16:03:49.090989 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators(9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators(9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-pcq8l_openshift-operators_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8_0(aa96c48a44dfd68fe19362e27f35d84a9c91abcff404ef4c2bef8e6f20c43e64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" podUID="9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8"
Dec 02 16:03:50 crc kubenswrapper[4933]: I1202 16:03:50.053304 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:50 crc kubenswrapper[4933]: I1202 16:03:50.054085 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:50 crc kubenswrapper[4933]: E1202 16:03:50.086902 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(404c27311142b86973a257395e87ed34e2a15218cb74aeddd80aaee55f7d8276): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:50 crc kubenswrapper[4933]: E1202 16:03:50.087096 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(404c27311142b86973a257395e87ed34e2a15218cb74aeddd80aaee55f7d8276): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:50 crc kubenswrapper[4933]: E1202 16:03:50.087209 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(404c27311142b86973a257395e87ed34e2a15218cb74aeddd80aaee55f7d8276): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:03:50 crc kubenswrapper[4933]: E1202 16:03:50.087361 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators(15f56600-a066-40fd-8433-d0552173dc57)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators(15f56600-a066-40fd-8433-d0552173dc57)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_openshift-operators_15f56600-a066-40fd-8433-d0552173dc57_0(404c27311142b86973a257395e87ed34e2a15218cb74aeddd80aaee55f7d8276): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" podUID="15f56600-a066-40fd-8433-d0552173dc57"
Dec 02 16:03:51 crc kubenswrapper[4933]: I1202 16:03:51.053312 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:51 crc kubenswrapper[4933]: I1202 16:03:51.054154 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:51 crc kubenswrapper[4933]: E1202 16:03:51.087906 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(04fc52318d21b64f5ee8b8b77ea6aa736c1961dd6b2b366a9ed7bbe68dcb3de7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:51 crc kubenswrapper[4933]: E1202 16:03:51.087986 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(04fc52318d21b64f5ee8b8b77ea6aa736c1961dd6b2b366a9ed7bbe68dcb3de7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:51 crc kubenswrapper[4933]: E1202 16:03:51.088018 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(04fc52318d21b64f5ee8b8b77ea6aa736c1961dd6b2b366a9ed7bbe68dcb3de7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:03:51 crc kubenswrapper[4933]: E1202 16:03:51.088083 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-s9dt9_openshift-operators(1ec8da33-52a7-4abb-a205-7c14a8186f5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-s9dt9_openshift-operators(1ec8da33-52a7-4abb-a205-7c14a8186f5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s9dt9_openshift-operators_1ec8da33-52a7-4abb-a205-7c14a8186f5e_0(04fc52318d21b64f5ee8b8b77ea6aa736c1961dd6b2b366a9ed7bbe68dcb3de7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-s9dt9" podUID="1ec8da33-52a7-4abb-a205-7c14a8186f5e"
Dec 02 16:03:52 crc kubenswrapper[4933]: I1202 16:03:52.052350 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:52 crc kubenswrapper[4933]: I1202 16:03:52.052742 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:52 crc kubenswrapper[4933]: E1202 16:03:52.080559 4933 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(59fa14b7ca3025369130e27efff13e7be5551bb9169ea48be339acd20df49412): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 02 16:03:52 crc kubenswrapper[4933]: E1202 16:03:52.080643 4933 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(59fa14b7ca3025369130e27efff13e7be5551bb9169ea48be339acd20df49412): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:52 crc kubenswrapper[4933]: E1202 16:03:52.080666 4933 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(59fa14b7ca3025369130e27efff13e7be5551bb9169ea48be339acd20df49412): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:03:52 crc kubenswrapper[4933]: E1202 16:03:52.080713 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators(a8b5527a-5f6c-461a-8397-c911f538eb3a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators(a8b5527a-5f6c-461a-8397-c911f538eb3a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_openshift-operators_a8b5527a-5f6c-461a-8397-c911f538eb3a_0(59fa14b7ca3025369130e27efff13e7be5551bb9169ea48be339acd20df49412): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" podUID="a8b5527a-5f6c-461a-8397-c911f538eb3a"
Dec 02 16:03:53 crc kubenswrapper[4933]: I1202 16:03:53.053382 4933 scope.go:117] "RemoveContainer" containerID="fdc8b3cab6425c14392d3e1388ade4844bac3e7998924034f044e5a47d73be67"
Dec 02 16:03:53 crc kubenswrapper[4933]: I1202 16:03:53.811380 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6kjz_b033c545-93a2-4401-842b-22456e44216b/kube-multus/2.log"
Dec 02 16:03:53 crc kubenswrapper[4933]: I1202 16:03:53.811932 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6kjz" event={"ID":"b033c545-93a2-4401-842b-22456e44216b","Type":"ContainerStarted","Data":"3f3caa4f3dec0c3f1ac74deadf856fa32174a068c15c4cc28aea60cc595fe52d"}
Dec 02 16:04:00 crc kubenswrapper[4933]: I1202 16:04:00.091767 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ngtkv"
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.053179 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.053223 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.053582 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.053921 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.496501 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-dpvq4"]
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.542596 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz"]
Dec 02 16:04:01 crc kubenswrapper[4933]: W1202 16:04:01.549546 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f56600_a066_40fd_8433_d0552173dc57.slice/crio-59d3c9d2d1e40cfa703afe8f2807297b484362728a8fa3059a4455e43f3f78f6 WatchSource:0}: Error finding container 59d3c9d2d1e40cfa703afe8f2807297b484362728a8fa3059a4455e43f3f78f6: Status 404 returned error can't find the container with id 59d3c9d2d1e40cfa703afe8f2807297b484362728a8fa3059a4455e43f3f78f6
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.853569 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" event={"ID":"9f22c4f9-d4f0-4ff8-9322-f03662c116a8","Type":"ContainerStarted","Data":"2f1c6b7ebdbbc8f47590a0c99fca79afb7d0f803d68a05b2b25aa95a6ad5bb79"}
Dec 02 16:04:01 crc kubenswrapper[4933]: I1202 16:04:01.855508 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" event={"ID":"15f56600-a066-40fd-8433-d0552173dc57","Type":"ContainerStarted","Data":"59d3c9d2d1e40cfa703afe8f2807297b484362728a8fa3059a4455e43f3f78f6"}
Dec 02 16:04:02 crc kubenswrapper[4933]: I1202 16:04:02.053167 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:04:02 crc kubenswrapper[4933]: I1202 16:04:02.053737 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"
Dec 02 16:04:02 crc kubenswrapper[4933]: I1202 16:04:02.258386 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l"]
Dec 02 16:04:02 crc kubenswrapper[4933]: W1202 16:04:02.261293 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0e9982_e6e0_43d3_8e6d_8835a52fe9d8.slice/crio-524c0b6fae1c0325d6086d18d89e245e2a6c65c3f7ba8b3f0d89521782d29561 WatchSource:0}: Error finding container 524c0b6fae1c0325d6086d18d89e245e2a6c65c3f7ba8b3f0d89521782d29561: Status 404 returned error can't find the container with id 524c0b6fae1c0325d6086d18d89e245e2a6c65c3f7ba8b3f0d89521782d29561
Dec 02 16:04:02 crc kubenswrapper[4933]: I1202 16:04:02.864673 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" event={"ID":"9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8","Type":"ContainerStarted","Data":"524c0b6fae1c0325d6086d18d89e245e2a6c65c3f7ba8b3f0d89521782d29561"}
Dec 02 16:04:05 crc kubenswrapper[4933]: I1202 16:04:05.057799 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:04:05 crc kubenswrapper[4933]: I1202 16:04:05.058529 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"
Dec 02 16:04:05 crc kubenswrapper[4933]: I1202 16:04:05.538124 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj"]
Dec 02 16:04:05 crc kubenswrapper[4933]: I1202 16:04:05.886290 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" event={"ID":"a8b5527a-5f6c-461a-8397-c911f538eb3a","Type":"ContainerStarted","Data":"d60292b925454ff7ccf57d7fea7c5433a4d5a0308e5346024261df12831b61a0"}
Dec 02 16:04:06 crc kubenswrapper[4933]: I1202 16:04:06.052878 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:04:06 crc kubenswrapper[4933]: I1202 16:04:06.053855 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:04:06 crc kubenswrapper[4933]: I1202 16:04:06.535541 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-s9dt9"]
Dec 02 16:04:09 crc kubenswrapper[4933]: W1202 16:04:09.985720 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec8da33_52a7_4abb_a205_7c14a8186f5e.slice/crio-36e48d19164b8a5386fe173d1e78ff92880cc163c5baf1ba94fc14b5d82527d7 WatchSource:0}: Error finding container 36e48d19164b8a5386fe173d1e78ff92880cc163c5baf1ba94fc14b5d82527d7: Status 404 returned error can't find the container with id 36e48d19164b8a5386fe173d1e78ff92880cc163c5baf1ba94fc14b5d82527d7
Dec 02 16:04:10 crc kubenswrapper[4933]: I1202 16:04:10.926468 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-s9dt9" event={"ID":"1ec8da33-52a7-4abb-a205-7c14a8186f5e","Type":"ContainerStarted","Data":"36e48d19164b8a5386fe173d1e78ff92880cc163c5baf1ba94fc14b5d82527d7"}
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.961139 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" event={"ID":"9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8","Type":"ContainerStarted","Data":"d4d96f5cc9b3df506c0be65b618de79a6800f18cf31be772d578366ced4c1ab2"}
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.963067 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" event={"ID":"9f22c4f9-d4f0-4ff8-9322-f03662c116a8","Type":"ContainerStarted","Data":"ea31ff759bbb04f9cdaaf1677561090546bbe133f5a6e4b5e7adb29b334dfd9f"}
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.963132 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.965157 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" event={"ID":"15f56600-a066-40fd-8433-d0552173dc57","Type":"ContainerStarted","Data":"ca942d67a093372e2ee87721e7a7d0aeac1678ec50fce9d5d32951cb89797f69"}
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.967236 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" event={"ID":"a8b5527a-5f6c-461a-8397-c911f538eb3a","Type":"ContainerStarted","Data":"98895911eaeb524ff54312a52cc1eb016524f9c580de8c83319c181446cd847f"}
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.972004 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-s9dt9" event={"ID":"1ec8da33-52a7-4abb-a205-7c14a8186f5e","Type":"ContainerStarted","Data":"95d0d2a2cee121a7252ca6564fafd1634b10b0b8c4d5f0e1e5100db445e30300"}
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.972182 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.980871 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-pcq8l" podStartSLOduration=28.151769095 podStartE2EDuration="40.980850756s" podCreationTimestamp="2025-12-02 16:03:35 +0000 UTC" firstStartedPulling="2025-12-02 16:04:02.263016056 +0000 UTC m=+705.514242759" lastFinishedPulling="2025-12-02 16:04:15.092097717 +0000 UTC m=+718.343324420" observedRunningTime="2025-12-02 16:04:15.979176762 +0000 UTC m=+719.230403485" watchObservedRunningTime="2025-12-02 16:04:15.980850756 +0000 UTC m=+719.232077479"
Dec 02 16:04:15 crc kubenswrapper[4933]: I1202 16:04:15.993531 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4"
Dec 02 16:04:16 crc kubenswrapper[4933]: I1202 16:04:16.012209 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-s9dt9" podStartSLOduration=35.825157513 podStartE2EDuration="41.012187561s" podCreationTimestamp="2025-12-02 16:03:35 +0000 UTC" firstStartedPulling="2025-12-02 16:04:09.990130424 +0000 UTC m=+713.241357137" lastFinishedPulling="2025-12-02 16:04:15.177160462 +0000 UTC m=+718.428387185" observedRunningTime="2025-12-02 16:04:16.008182774 +0000 UTC m=+719.259409497" watchObservedRunningTime="2025-12-02 16:04:16.012187561 +0000 UTC m=+719.263414264"
Dec 02 16:04:16 crc kubenswrapper[4933]: I1202 16:04:16.031409 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz" podStartSLOduration=27.492097277 podStartE2EDuration="41.031390432s" podCreationTimestamp="2025-12-02 16:03:35 +0000 UTC" firstStartedPulling="2025-12-02 16:04:01.552767341 +0000 UTC m=+704.803994044" lastFinishedPulling="2025-12-02 16:04:15.092060496 +0000 UTC m=+718.343287199" observedRunningTime="2025-12-02 16:04:16.027300353 +0000 UTC m=+719.278527076" watchObservedRunningTime="2025-12-02 16:04:16.031390432 +0000 UTC m=+719.282617135"
Dec 02 16:04:16 crc kubenswrapper[4933]: I1202 16:04:16.054860 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-dpvq4" podStartSLOduration=27.472526246 podStartE2EDuration="41.054842607s" podCreationTimestamp="2025-12-02 16:03:35 +0000 UTC" firstStartedPulling="2025-12-02 16:04:01.509788816 +0000 UTC m=+704.761015519" lastFinishedPulling="2025-12-02 16:04:15.092105177 +0000 UTC m=+718.343331880" observedRunningTime="2025-12-02 16:04:16.05195088 +0000 UTC m=+719.303177613" watchObservedRunningTime="2025-12-02 16:04:16.054842607 +0000 UTC m=+719.306069310"
Dec 02 16:04:16 crc kubenswrapper[4933]: I1202 16:04:16.073057 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj" podStartSLOduration=31.471339811 podStartE2EDuration="41.073037631s" podCreationTimestamp="2025-12-02 16:03:35 +0000 UTC" firstStartedPulling="2025-12-02 16:04:05.550073176 +0000 UTC m=+708.801299879" lastFinishedPulling="2025-12-02 16:04:15.151770996 +0000 UTC m=+718.402997699" observedRunningTime="2025-12-02 16:04:16.069043175 +0000 UTC m=+719.320269888" watchObservedRunningTime="2025-12-02 16:04:16.073037631 +0000 UTC m=+719.324264334"
Dec 02 16:04:17 crc kubenswrapper[4933]: I1202 16:04:17.170002 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:04:17 crc kubenswrapper[4933]: I1202 16:04:17.170071 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:04:25 crc kubenswrapper[4933]: I1202 16:04:25.929792 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-s9dt9"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.358750 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q4f6z"]
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.359975 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.361919 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.362337 4933 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-fk4hv"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.365421 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.371867 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q4f6z"]
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.378144 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dhbnp"]
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.379030 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-dhbnp"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.381473 4933 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-k7ps6"
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.394367 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lpsrb"]
Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.395304 4933 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.399640 4933 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-scpvv" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.428771 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dhbnp"] Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.437970 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lpsrb"] Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.457811 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khslt\" (UniqueName: \"kubernetes.io/projected/41523154-4286-47e3-9d29-3adb94b50073-kube-api-access-khslt\") pod \"cert-manager-webhook-5655c58dd6-lpsrb\" (UID: \"41523154-4286-47e3-9d29-3adb94b50073\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.457897 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfpcr\" (UniqueName: \"kubernetes.io/projected/e81749dd-af4f-46b4-954a-4581cf720cd5-kube-api-access-bfpcr\") pod \"cert-manager-5b446d88c5-dhbnp\" (UID: \"e81749dd-af4f-46b4-954a-4581cf720cd5\") " pod="cert-manager/cert-manager-5b446d88c5-dhbnp" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.457942 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmt6\" (UniqueName: \"kubernetes.io/projected/b2d605a1-7e07-47c6-af5f-e2f2a41df05d-kube-api-access-8kmt6\") pod \"cert-manager-cainjector-7f985d654d-q4f6z\" (UID: \"b2d605a1-7e07-47c6-af5f-e2f2a41df05d\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.558883 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khslt\" (UniqueName: \"kubernetes.io/projected/41523154-4286-47e3-9d29-3adb94b50073-kube-api-access-khslt\") pod \"cert-manager-webhook-5655c58dd6-lpsrb\" (UID: \"41523154-4286-47e3-9d29-3adb94b50073\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.558935 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfpcr\" (UniqueName: \"kubernetes.io/projected/e81749dd-af4f-46b4-954a-4581cf720cd5-kube-api-access-bfpcr\") pod \"cert-manager-5b446d88c5-dhbnp\" (UID: \"e81749dd-af4f-46b4-954a-4581cf720cd5\") " pod="cert-manager/cert-manager-5b446d88c5-dhbnp" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.558968 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmt6\" (UniqueName: \"kubernetes.io/projected/b2d605a1-7e07-47c6-af5f-e2f2a41df05d-kube-api-access-8kmt6\") pod \"cert-manager-cainjector-7f985d654d-q4f6z\" (UID: \"b2d605a1-7e07-47c6-af5f-e2f2a41df05d\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.578264 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khslt\" (UniqueName: \"kubernetes.io/projected/41523154-4286-47e3-9d29-3adb94b50073-kube-api-access-khslt\") pod \"cert-manager-webhook-5655c58dd6-lpsrb\" (UID: \"41523154-4286-47e3-9d29-3adb94b50073\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.579818 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmt6\" (UniqueName: \"kubernetes.io/projected/b2d605a1-7e07-47c6-af5f-e2f2a41df05d-kube-api-access-8kmt6\") pod \"cert-manager-cainjector-7f985d654d-q4f6z\" (UID: \"b2d605a1-7e07-47c6-af5f-e2f2a41df05d\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.582888 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfpcr\" (UniqueName: \"kubernetes.io/projected/e81749dd-af4f-46b4-954a-4581cf720cd5-kube-api-access-bfpcr\") pod \"cert-manager-5b446d88c5-dhbnp\" (UID: \"e81749dd-af4f-46b4-954a-4581cf720cd5\") " pod="cert-manager/cert-manager-5b446d88c5-dhbnp" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.684802 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.708003 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-dhbnp" Dec 02 16:04:26 crc kubenswrapper[4933]: I1202 16:04:26.717157 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:27 crc kubenswrapper[4933]: I1202 16:04:27.081634 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lpsrb"] Dec 02 16:04:27 crc kubenswrapper[4933]: W1202 16:04:27.087078 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41523154_4286_47e3_9d29_3adb94b50073.slice/crio-47f9dc112080804c1bf179c5a0b8c30eed9704df9fdf5951263c0630935376a5 WatchSource:0}: Error finding container 47f9dc112080804c1bf179c5a0b8c30eed9704df9fdf5951263c0630935376a5: Status 404 returned error can't find the container with id 47f9dc112080804c1bf179c5a0b8c30eed9704df9fdf5951263c0630935376a5 Dec 02 16:04:27 crc kubenswrapper[4933]: I1202 16:04:27.189122 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dhbnp"] Dec 02 16:04:27 crc kubenswrapper[4933]: W1202 16:04:27.190618 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81749dd_af4f_46b4_954a_4581cf720cd5.slice/crio-8403239b0cba1244531b599d1c5d80d7b098212f2c41f078a190b132e85b95ed WatchSource:0}: Error finding container 8403239b0cba1244531b599d1c5d80d7b098212f2c41f078a190b132e85b95ed: Status 404 returned error can't find the container with id 8403239b0cba1244531b599d1c5d80d7b098212f2c41f078a190b132e85b95ed Dec 02 16:04:27 crc kubenswrapper[4933]: I1202 16:04:27.199836 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q4f6z"] Dec 02 16:04:27 crc kubenswrapper[4933]: W1202 16:04:27.203048 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2d605a1_7e07_47c6_af5f_e2f2a41df05d.slice/crio-1cf4306b907f30af843b2d4dae6995f6b5ee72aa960c7999f056e4bc5212d1bc WatchSource:0}: Error finding container 1cf4306b907f30af843b2d4dae6995f6b5ee72aa960c7999f056e4bc5212d1bc: Status 404 returned error can't find the container with id 
1cf4306b907f30af843b2d4dae6995f6b5ee72aa960c7999f056e4bc5212d1bc Dec 02 16:04:28 crc kubenswrapper[4933]: I1202 16:04:28.049000 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-dhbnp" event={"ID":"e81749dd-af4f-46b4-954a-4581cf720cd5","Type":"ContainerStarted","Data":"8403239b0cba1244531b599d1c5d80d7b098212f2c41f078a190b132e85b95ed"} Dec 02 16:04:28 crc kubenswrapper[4933]: I1202 16:04:28.050373 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" event={"ID":"41523154-4286-47e3-9d29-3adb94b50073","Type":"ContainerStarted","Data":"47f9dc112080804c1bf179c5a0b8c30eed9704df9fdf5951263c0630935376a5"} Dec 02 16:04:28 crc kubenswrapper[4933]: I1202 16:04:28.051746 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" event={"ID":"b2d605a1-7e07-47c6-af5f-e2f2a41df05d","Type":"ContainerStarted","Data":"1cf4306b907f30af843b2d4dae6995f6b5ee72aa960c7999f056e4bc5212d1bc"} Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.107981 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" event={"ID":"b2d605a1-7e07-47c6-af5f-e2f2a41df05d","Type":"ContainerStarted","Data":"1719d9e9d2c24fcf3a3f2582bfcad70e8bb4f5fe50e9b05455e4028d6808bcbe"} Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.109553 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-dhbnp" event={"ID":"e81749dd-af4f-46b4-954a-4581cf720cd5","Type":"ContainerStarted","Data":"b0bda68b6b357d935df35122fa6cd141b070b073c9b95c51c7fe828c81b64e31"} Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.111860 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" event={"ID":"41523154-4286-47e3-9d29-3adb94b50073","Type":"ContainerStarted","Data":"de7f59ebe007e3791d31e1a71d6cb9efda965bb84fdfdf357fb99bf48a09b123"} Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.112019 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.134767 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-q4f6z" podStartSLOduration=1.501191092 podStartE2EDuration="11.13475088s" podCreationTimestamp="2025-12-02 16:04:26 +0000 UTC" firstStartedPulling="2025-12-02 16:04:27.205549179 +0000 UTC m=+730.456775882" lastFinishedPulling="2025-12-02 16:04:36.839108967 +0000 UTC m=+740.090335670" observedRunningTime="2025-12-02 16:04:37.130086106 +0000 UTC m=+740.381312819" watchObservedRunningTime="2025-12-02 16:04:37.13475088 +0000 UTC m=+740.385977583" Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.148281 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" podStartSLOduration=1.459745789 podStartE2EDuration="11.14826417s" podCreationTimestamp="2025-12-02 16:04:26 +0000 UTC" firstStartedPulling="2025-12-02 16:04:27.089358345 +0000 UTC m=+730.340585048" lastFinishedPulling="2025-12-02 16:04:36.777876706 +0000 UTC m=+740.029103429" observedRunningTime="2025-12-02 16:04:37.142385834 +0000 UTC m=+740.393612537" watchObservedRunningTime="2025-12-02 16:04:37.14826417 +0000 UTC m=+740.399490873" Dec 02 16:04:37 crc kubenswrapper[4933]: I1202 16:04:37.161573 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-dhbnp" podStartSLOduration=1.571725391 podStartE2EDuration="11.161551604s" podCreationTimestamp="2025-12-02 16:04:26 +0000 UTC" firstStartedPulling="2025-12-02 16:04:27.192733898 +0000 UTC m=+730.443960601" lastFinishedPulling="2025-12-02 16:04:36.782560101 +0000 UTC m=+740.033786814" observedRunningTime="2025-12-02 16:04:37.156208142 +0000 UTC m=+740.407434845" watchObservedRunningTime="2025-12-02 16:04:37.161551604 +0000 UTC m=+740.412778307" Dec 02 16:04:46 crc kubenswrapper[4933]: I1202 16:04:46.721915 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-lpsrb" Dec 02 16:04:47 crc kubenswrapper[4933]: I1202 16:04:47.169300 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:04:47 crc kubenswrapper[4933]: I1202 16:04:47.169373 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:04:58 crc kubenswrapper[4933]: I1202 16:04:58.263334 4933 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.805102 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l"] Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.808212 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.817128 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.821463 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l"] Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.883914 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq8g7\" (UniqueName: \"kubernetes.io/projected/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-kube-api-access-nq8g7\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.883973 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.884045 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.985263 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq8g7\" (UniqueName: \"kubernetes.io/projected/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-kube-api-access-nq8g7\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.985640 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.985687 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.986203 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:10 crc kubenswrapper[4933]: I1202 16:05:10.986286 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:11 crc kubenswrapper[4933]: I1202 16:05:11.015599 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq8g7\" (UniqueName: \"kubernetes.io/projected/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-kube-api-access-nq8g7\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:11 crc kubenswrapper[4933]: I1202 16:05:11.124481 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:11 crc kubenswrapper[4933]: I1202 16:05:11.400654 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l"] Dec 02 16:05:12 crc kubenswrapper[4933]: I1202 16:05:12.343703 4933 generic.go:334] "Generic (PLEG): container finished" podID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerID="ed832889736297345d7923186b398b4be40b4ed8cd27a2045c43a3c46f5ca866" exitCode=0 Dec 02 16:05:12 crc kubenswrapper[4933]: I1202 16:05:12.343752 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" event={"ID":"bf9b2c95-38b7-4a61-89ae-d40735ddccaa","Type":"ContainerDied","Data":"ed832889736297345d7923186b398b4be40b4ed8cd27a2045c43a3c46f5ca866"} Dec 02 16:05:12 crc kubenswrapper[4933]: I1202 16:05:12.343775 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" event={"ID":"bf9b2c95-38b7-4a61-89ae-d40735ddccaa","Type":"ContainerStarted","Data":"3ff0ecbd6995a9f735fe07b1d387a3fd496ae52424d7ed334a9352d05b6670d4"} Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.356723 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" event={"ID":"bf9b2c95-38b7-4a61-89ae-d40735ddccaa","Type":"ContainerStarted","Data":"89729bbbae846f393b6625d3d35816b04c595d5e64e452b44e66c9880e5b960f"} Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.369410 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sg5rj"] Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.375132 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.389331 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sg5rj"] Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.467052 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9pq\" (UniqueName: \"kubernetes.io/projected/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-kube-api-access-jt9pq\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.467115 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-catalog-content\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.467154 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-utilities\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.568594 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-catalog-content\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.568659 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-utilities\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.568773 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9pq\" (UniqueName: \"kubernetes.io/projected/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-kube-api-access-jt9pq\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.569217 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-catalog-content\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.569261 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-utilities\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.592862 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jt9pq\" (UniqueName: \"kubernetes.io/projected/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-kube-api-access-jt9pq\") pod \"redhat-operators-sg5rj\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.698458 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:14 crc kubenswrapper[4933]: I1202 16:05:14.919211 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sg5rj"] Dec 02 16:05:14 crc kubenswrapper[4933]: W1202 16:05:14.924080 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c0027d_cb42_4cdd_83c6_fa8bc851bc3d.slice/crio-f223f71cee9d9b63bd855d714986f1564f0ec6d47ddf199518858fded1467131 WatchSource:0}: Error finding container f223f71cee9d9b63bd855d714986f1564f0ec6d47ddf199518858fded1467131: Status 404 returned error can't find the container with id f223f71cee9d9b63bd855d714986f1564f0ec6d47ddf199518858fded1467131 Dec 02 16:05:15 crc kubenswrapper[4933]: I1202 16:05:15.368258 4933 generic.go:334] "Generic (PLEG): container finished" podID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerID="7288d5a2807d82429f0de541a1080cb2d231a613ee29681b378a9ffe52365a39" exitCode=0 Dec 02 16:05:15 crc kubenswrapper[4933]: I1202 16:05:15.368540 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerDied","Data":"7288d5a2807d82429f0de541a1080cb2d231a613ee29681b378a9ffe52365a39"} Dec 02 16:05:15 crc kubenswrapper[4933]: I1202 16:05:15.368567 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerStarted","Data":"f223f71cee9d9b63bd855d714986f1564f0ec6d47ddf199518858fded1467131"} Dec 02 16:05:15 crc kubenswrapper[4933]: I1202 16:05:15.374718 4933 generic.go:334] "Generic (PLEG): container finished" podID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerID="89729bbbae846f393b6625d3d35816b04c595d5e64e452b44e66c9880e5b960f" exitCode=0 Dec 02 16:05:15 crc kubenswrapper[4933]: I1202 16:05:15.374768 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" event={"ID":"bf9b2c95-38b7-4a61-89ae-d40735ddccaa","Type":"ContainerDied","Data":"89729bbbae846f393b6625d3d35816b04c595d5e64e452b44e66c9880e5b960f"} Dec 02 16:05:16 crc kubenswrapper[4933]: I1202 16:05:16.387973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerStarted","Data":"24d230c9cee7e4b14efda4fb4ecc9302ced089db94cf462e88973e251b2fcd3a"} Dec 02 16:05:16 crc kubenswrapper[4933]: I1202 16:05:16.390193 4933 generic.go:334] "Generic (PLEG): container finished" podID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerID="01b70433a241fce8b5f661640a9ec903cebc41ae60553fdaf3245a32dadcff04" exitCode=0 Dec 02 16:05:16 crc kubenswrapper[4933]: I1202 16:05:16.390243 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" 
event={"ID":"bf9b2c95-38b7-4a61-89ae-d40735ddccaa","Type":"ContainerDied","Data":"01b70433a241fce8b5f661640a9ec903cebc41ae60553fdaf3245a32dadcff04"} Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.169943 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.169994 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.170032 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.170560 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"338ae7bd911845a8a5e2ec13c95bb1233e30afce64a9ca00f9d9ea1398ecf073"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.170610 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://338ae7bd911845a8a5e2ec13c95bb1233e30afce64a9ca00f9d9ea1398ecf073" gracePeriod=600 Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.397618 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="338ae7bd911845a8a5e2ec13c95bb1233e30afce64a9ca00f9d9ea1398ecf073" exitCode=0 Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.398252 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"338ae7bd911845a8a5e2ec13c95bb1233e30afce64a9ca00f9d9ea1398ecf073"} Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.398293 4933 scope.go:117] "RemoveContainer" containerID="ad85216952cd1ae86bb25d4b70ffbba75c0b70e8a399ab6abbe21111aa2e2cec" Dec 02 16:05:17 crc kubenswrapper[4933]: I1202 16:05:17.967693 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.057088 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-bundle\") pod \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.057547 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-util\") pod \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.057682 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq8g7\" (UniqueName: \"kubernetes.io/projected/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-kube-api-access-nq8g7\") pod \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\" (UID: \"bf9b2c95-38b7-4a61-89ae-d40735ddccaa\") " Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.058087 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-bundle" (OuterVolumeSpecName: "bundle") pod "bf9b2c95-38b7-4a61-89ae-d40735ddccaa" (UID: "bf9b2c95-38b7-4a61-89ae-d40735ddccaa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.058506 4933 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.081617 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-util" (OuterVolumeSpecName: "util") pod "bf9b2c95-38b7-4a61-89ae-d40735ddccaa" (UID: "bf9b2c95-38b7-4a61-89ae-d40735ddccaa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.087370 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-kube-api-access-nq8g7" (OuterVolumeSpecName: "kube-api-access-nq8g7") pod "bf9b2c95-38b7-4a61-89ae-d40735ddccaa" (UID: "bf9b2c95-38b7-4a61-89ae-d40735ddccaa"). InnerVolumeSpecName "kube-api-access-nq8g7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.159641 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq8g7\" (UniqueName: \"kubernetes.io/projected/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-kube-api-access-nq8g7\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.159685 4933 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf9b2c95-38b7-4a61-89ae-d40735ddccaa-util\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.407848 4933 generic.go:334] "Generic (PLEG): container finished" podID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerID="24d230c9cee7e4b14efda4fb4ecc9302ced089db94cf462e88973e251b2fcd3a" exitCode=0 Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.407892 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerDied","Data":"24d230c9cee7e4b14efda4fb4ecc9302ced089db94cf462e88973e251b2fcd3a"} Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.411743 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" event={"ID":"bf9b2c95-38b7-4a61-89ae-d40735ddccaa","Type":"ContainerDied","Data":"3ff0ecbd6995a9f735fe07b1d387a3fd496ae52424d7ed334a9352d05b6670d4"} Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.411779 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff0ecbd6995a9f735fe07b1d387a3fd496ae52424d7ed334a9352d05b6670d4" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.411880 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l" Dec 02 16:05:18 crc kubenswrapper[4933]: I1202 16:05:18.417715 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"21d5a98b0f608aa28a64cf9b966f8d88f136c42a20cbdba910ec855c8416478d"} Dec 02 16:05:19 crc kubenswrapper[4933]: I1202 16:05:19.427055 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerStarted","Data":"16f2bc6ef4d2f96bbde82a7af13186b1f791018172e640bc0a67284f20066e3d"} Dec 02 16:05:19 crc kubenswrapper[4933]: I1202 16:05:19.452603 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sg5rj" podStartSLOduration=1.903960134 podStartE2EDuration="5.45258425s" podCreationTimestamp="2025-12-02 16:05:14 +0000 UTC" firstStartedPulling="2025-12-02 16:05:15.369817279 +0000 UTC m=+778.621043982" lastFinishedPulling="2025-12-02 16:05:18.918441395 +0000 UTC m=+782.169668098" observedRunningTime="2025-12-02 16:05:19.452280642 +0000 UTC m=+782.703507345" watchObservedRunningTime="2025-12-02 16:05:19.45258425 +0000 UTC m=+782.703810953" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.593807 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb"] Dec 02 16:05:20 crc kubenswrapper[4933]: E1202 16:05:20.594105 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="extract" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.594119 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="extract" Dec 02 16:05:20 crc kubenswrapper[4933]: E1202 16:05:20.594138 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="util" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.594147 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="util" Dec 02 16:05:20 crc kubenswrapper[4933]: E1202 16:05:20.594171 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="pull" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.594181 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="pull" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.594336 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9b2c95-38b7-4a61-89ae-d40735ddccaa" containerName="extract" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.595436 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.597274 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.606987 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb"] Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.694716 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.694776 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6m2\" (UniqueName: \"kubernetes.io/projected/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-kube-api-access-xt6m2\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.694795 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.796115 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.796203 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6m2\" (UniqueName: \"kubernetes.io/projected/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-kube-api-access-xt6m2\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.796242 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.796686 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.796808 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.824349 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6m2\" (UniqueName: \"kubernetes.io/projected/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-kube-api-access-xt6m2\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:20 crc kubenswrapper[4933]: I1202 16:05:20.915787 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:21 crc kubenswrapper[4933]: I1202 16:05:21.138790 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb"] Dec 02 16:05:21 crc kubenswrapper[4933]: W1202 16:05:21.143397 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bd3f79_2e8f_42fc_b9c1_f7dc2ab455d7.slice/crio-5979e5850bb1ed2388e5735ae5de372e0b85d981d7ab6441f149592150d2f4fa WatchSource:0}: Error finding container 5979e5850bb1ed2388e5735ae5de372e0b85d981d7ab6441f149592150d2f4fa: Status 404 returned error can't find the container with id 5979e5850bb1ed2388e5735ae5de372e0b85d981d7ab6441f149592150d2f4fa Dec 02 16:05:21 crc kubenswrapper[4933]: I1202 16:05:21.440426 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" event={"ID":"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7","Type":"ContainerStarted","Data":"ad95d28969bac27c93d13ef3abcd21ce983abff9b9f234f3f6547ba45aa1dd43"} Dec 02 16:05:21 crc kubenswrapper[4933]: I1202 16:05:21.440478 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" event={"ID":"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7","Type":"ContainerStarted","Data":"5979e5850bb1ed2388e5735ae5de372e0b85d981d7ab6441f149592150d2f4fa"} Dec 02 16:05:22 crc kubenswrapper[4933]: I1202 16:05:22.467876 4933 generic.go:334] "Generic (PLEG): container finished" podID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerID="ad95d28969bac27c93d13ef3abcd21ce983abff9b9f234f3f6547ba45aa1dd43" exitCode=0 Dec 02 16:05:22 crc kubenswrapper[4933]: I1202 16:05:22.468200 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" event={"ID":"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7","Type":"ContainerDied","Data":"ad95d28969bac27c93d13ef3abcd21ce983abff9b9f234f3f6547ba45aa1dd43"} Dec 02 16:05:24 crc 
kubenswrapper[4933]: E1202 16:05:24.233596 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bd3f79_2e8f_42fc_b9c1_f7dc2ab455d7.slice/crio-4727d2c0341f273f1c246bc1668789214a87094dcbf77047c4552670c7efd5da.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:05:24 crc kubenswrapper[4933]: I1202 16:05:24.481534 4933 generic.go:334] "Generic (PLEG): container finished" podID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerID="4727d2c0341f273f1c246bc1668789214a87094dcbf77047c4552670c7efd5da" exitCode=0 Dec 02 16:05:24 crc kubenswrapper[4933]: I1202 16:05:24.481594 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" event={"ID":"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7","Type":"ContainerDied","Data":"4727d2c0341f273f1c246bc1668789214a87094dcbf77047c4552670c7efd5da"} Dec 02 16:05:24 crc kubenswrapper[4933]: I1202 16:05:24.698999 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:24 crc kubenswrapper[4933]: I1202 16:05:24.699072 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:24 crc kubenswrapper[4933]: I1202 16:05:24.737459 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:25 crc kubenswrapper[4933]: I1202 16:05:25.489459 4933 generic.go:334] "Generic (PLEG): container finished" podID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerID="5dfd4cf8b490ce589f7f34f30438fef65e2f40f369f0f613a35c71e26b13324d" exitCode=0 Dec 02 16:05:25 crc kubenswrapper[4933]: I1202 16:05:25.489536 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" event={"ID":"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7","Type":"ContainerDied","Data":"5dfd4cf8b490ce589f7f34f30438fef65e2f40f369f0f613a35c71e26b13324d"} Dec 02 16:05:25 crc kubenswrapper[4933]: I1202 16:05:25.534525 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.191405 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk"] Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.192461 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.193881 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.194791 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-z96v7" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.195483 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.195685 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.195995 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.196029 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.211641 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk"] Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.293717 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-apiservice-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.293782 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7nhf\" (UniqueName: \"kubernetes.io/projected/a390b3c7-fa88-43d0-81d1-ca767f1e9133-kube-api-access-l7nhf\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.293920 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a390b3c7-fa88-43d0-81d1-ca767f1e9133-manager-config\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.293974 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-webhook-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.294014 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.395339 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-webhook-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.396639 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.396691 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-apiservice-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.396729 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7nhf\" (UniqueName: \"kubernetes.io/projected/a390b3c7-fa88-43d0-81d1-ca767f1e9133-kube-api-access-l7nhf\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.396895 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a390b3c7-fa88-43d0-81d1-ca767f1e9133-manager-config\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.397811 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a390b3c7-fa88-43d0-81d1-ca767f1e9133-manager-config\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.401783 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-webhook-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.402999 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.405354 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a390b3c7-fa88-43d0-81d1-ca767f1e9133-apiservice-cert\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.416617 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7nhf\" (UniqueName: \"kubernetes.io/projected/a390b3c7-fa88-43d0-81d1-ca767f1e9133-kube-api-access-l7nhf\") pod \"loki-operator-controller-manager-5dfc66d568-zbwvk\" (UID: \"a390b3c7-fa88-43d0-81d1-ca767f1e9133\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.506979 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.769115 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.813459 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk"] Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.815499 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-util\") pod \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.816131 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-bundle\") pod \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.816196 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt6m2\" (UniqueName: \"kubernetes.io/projected/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-kube-api-access-xt6m2\") pod \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\" (UID: \"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7\") " Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.817150 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-bundle" (OuterVolumeSpecName: "bundle") pod "e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" (UID: "e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.823619 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-kube-api-access-xt6m2" (OuterVolumeSpecName: "kube-api-access-xt6m2") pod "e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" (UID: "e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7"). InnerVolumeSpecName "kube-api-access-xt6m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.917339 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt6m2\" (UniqueName: \"kubernetes.io/projected/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-kube-api-access-xt6m2\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:26 crc kubenswrapper[4933]: I1202 16:05:26.917381 4933 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:27 crc kubenswrapper[4933]: I1202 16:05:27.504758 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" Dec 02 16:05:27 crc kubenswrapper[4933]: I1202 16:05:27.504799 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb" event={"ID":"e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7","Type":"ContainerDied","Data":"5979e5850bb1ed2388e5735ae5de372e0b85d981d7ab6441f149592150d2f4fa"} Dec 02 16:05:27 crc kubenswrapper[4933]: I1202 16:05:27.504896 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5979e5850bb1ed2388e5735ae5de372e0b85d981d7ab6441f149592150d2f4fa" Dec 02 16:05:27 crc kubenswrapper[4933]: I1202 16:05:27.506911 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" event={"ID":"a390b3c7-fa88-43d0-81d1-ca767f1e9133","Type":"ContainerStarted","Data":"f6d7bcf7e51126943e65f8423429dfcba4148fd3b76c6cbf833560dbb10745eb"} Dec 02 16:05:28 crc kubenswrapper[4933]: I1202 16:05:28.222503 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-util" (OuterVolumeSpecName: "util") pod "e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" (UID: "e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:05:28 crc kubenswrapper[4933]: I1202 16:05:28.249549 4933 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7-util\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:28 crc kubenswrapper[4933]: I1202 16:05:28.351642 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sg5rj"] Dec 02 16:05:28 crc kubenswrapper[4933]: I1202 16:05:28.352140 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sg5rj" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="registry-server" containerID="cri-o://16f2bc6ef4d2f96bbde82a7af13186b1f791018172e640bc0a67284f20066e3d" gracePeriod=2 Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.521868 4933 generic.go:334] "Generic (PLEG): container finished" podID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerID="16f2bc6ef4d2f96bbde82a7af13186b1f791018172e640bc0a67284f20066e3d" exitCode=0 Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.521944 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerDied","Data":"16f2bc6ef4d2f96bbde82a7af13186b1f791018172e640bc0a67284f20066e3d"} Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.925804 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.982991 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt9pq\" (UniqueName: \"kubernetes.io/projected/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-kube-api-access-jt9pq\") pod \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.983083 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-catalog-content\") pod \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.983220 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-utilities\") pod \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\" (UID: \"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d\") " Dec 02 16:05:29 crc kubenswrapper[4933]: I1202 16:05:29.985015 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-utilities" (OuterVolumeSpecName: "utilities") pod "69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" (UID: "69c0027d-cb42-4cdd-83c6-fa8bc851bc3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.021005 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-kube-api-access-jt9pq" (OuterVolumeSpecName: "kube-api-access-jt9pq") pod "69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" (UID: "69c0027d-cb42-4cdd-83c6-fa8bc851bc3d"). InnerVolumeSpecName "kube-api-access-jt9pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.096416 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt9pq\" (UniqueName: \"kubernetes.io/projected/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-kube-api-access-jt9pq\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.096477 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.143415 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" (UID: "69c0027d-cb42-4cdd-83c6-fa8bc851bc3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.197719 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.535365 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg5rj" event={"ID":"69c0027d-cb42-4cdd-83c6-fa8bc851bc3d","Type":"ContainerDied","Data":"f223f71cee9d9b63bd855d714986f1564f0ec6d47ddf199518858fded1467131"} Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.535423 4933 scope.go:117] "RemoveContainer" containerID="16f2bc6ef4d2f96bbde82a7af13186b1f791018172e640bc0a67284f20066e3d" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.535547 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sg5rj" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.557258 4933 scope.go:117] "RemoveContainer" containerID="24d230c9cee7e4b14efda4fb4ecc9302ced089db94cf462e88973e251b2fcd3a" Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.574552 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sg5rj"] Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.581307 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sg5rj"] Dec 02 16:05:30 crc kubenswrapper[4933]: I1202 16:05:30.588607 4933 scope.go:117] "RemoveContainer" containerID="7288d5a2807d82429f0de541a1080cb2d231a613ee29681b378a9ffe52365a39" Dec 02 16:05:31 crc kubenswrapper[4933]: I1202 16:05:31.063220 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" path="/var/lib/kubelet/pods/69c0027d-cb42-4cdd-83c6-fa8bc851bc3d/volumes" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.564775 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" event={"ID":"a390b3c7-fa88-43d0-81d1-ca767f1e9133","Type":"ContainerStarted","Data":"9ddb003e65767328dbb3ea4d29e677d99f9bba9c3a718ef5b31bb217c63ce50e"} Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.886958 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-sg6kn"] Dec 02 16:05:34 crc kubenswrapper[4933]: E1202 16:05:34.887249 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="pull" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887269 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="pull" Dec 02 16:05:34 crc kubenswrapper[4933]: E1202 16:05:34.887285 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="util" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887294 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="util" Dec 02 16:05:34 crc kubenswrapper[4933]: E1202 16:05:34.887310 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="extract-utilities" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887317 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="extract-utilities" Dec 02 16:05:34 crc kubenswrapper[4933]: E1202 16:05:34.887337 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="extract-content" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887366 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="extract-content" Dec 02 16:05:34 crc kubenswrapper[4933]: E1202 16:05:34.887376 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="extract" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887385 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="extract" Dec 02 16:05:34 crc kubenswrapper[4933]: E1202 16:05:34.887397 4933 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="registry-server" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887404 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="registry-server" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887557 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c0027d-cb42-4cdd-83c6-fa8bc851bc3d" containerName="registry-server" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.887573 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7" containerName="extract" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.888097 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.891028 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.891672 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.891760 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-mnt8s" Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.918583 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-sg6kn"] Dec 02 16:05:34 crc kubenswrapper[4933]: I1202 16:05:34.975276 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-945ds\" (UniqueName: \"kubernetes.io/projected/5390a7b1-0b5d-485c-8fc3-44fca44d4286-kube-api-access-945ds\") pod \"cluster-logging-operator-ff9846bd-sg6kn\" (UID: \"5390a7b1-0b5d-485c-8fc3-44fca44d4286\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" Dec 02 16:05:35 crc kubenswrapper[4933]: I1202 16:05:35.077155 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-945ds\" (UniqueName: \"kubernetes.io/projected/5390a7b1-0b5d-485c-8fc3-44fca44d4286-kube-api-access-945ds\") pod \"cluster-logging-operator-ff9846bd-sg6kn\" (UID: \"5390a7b1-0b5d-485c-8fc3-44fca44d4286\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" Dec 02 16:05:35 crc kubenswrapper[4933]: I1202 16:05:35.099035 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-945ds\" (UniqueName: \"kubernetes.io/projected/5390a7b1-0b5d-485c-8fc3-44fca44d4286-kube-api-access-945ds\") pod \"cluster-logging-operator-ff9846bd-sg6kn\" (UID: \"5390a7b1-0b5d-485c-8fc3-44fca44d4286\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" Dec 02 16:05:35 crc kubenswrapper[4933]: I1202 16:05:35.203026 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" Dec 02 16:05:35 crc kubenswrapper[4933]: I1202 16:05:35.688655 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-sg6kn"] Dec 02 16:05:36 crc kubenswrapper[4933]: I1202 16:05:36.582113 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" event={"ID":"5390a7b1-0b5d-485c-8fc3-44fca44d4286","Type":"ContainerStarted","Data":"e2d6db5dbd5ae8a6aa8237b60b2cb4bc6389190683d7c6fb18042afe0c8663b6"} Dec 02 16:05:41 crc kubenswrapper[4933]: I1202 16:05:41.615425 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" event={"ID":"a390b3c7-fa88-43d0-81d1-ca767f1e9133","Type":"ContainerStarted","Data":"da189bc02bcf4af292fe3fd5c313ec9b3d0902df513f0de6899e48bafb1fa558"} Dec 02 16:05:41 crc kubenswrapper[4933]: I1202 16:05:41.616043 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:41 crc kubenswrapper[4933]: I1202 16:05:41.618700 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" Dec 02 16:05:41 crc kubenswrapper[4933]: I1202 16:05:41.682056 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-5dfc66d568-zbwvk" podStartSLOduration=1.657303145 podStartE2EDuration="15.682039309s" podCreationTimestamp="2025-12-02 16:05:26 +0000 UTC" firstStartedPulling="2025-12-02 16:05:26.836945338 +0000 UTC m=+790.088172041" lastFinishedPulling="2025-12-02 16:05:40.861681502 +0000 UTC m=+804.112908205" observedRunningTime="2025-12-02 16:05:41.680106167 +0000 UTC m=+804.931332890" watchObservedRunningTime="2025-12-02 16:05:41.682039309 +0000 UTC m=+804.933266022" Dec 02 16:05:44 crc kubenswrapper[4933]: I1202 16:05:44.634834 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" event={"ID":"5390a7b1-0b5d-485c-8fc3-44fca44d4286","Type":"ContainerStarted","Data":"595a38b9cfb6d0c6947e5cc145ac0e2d1ab6ad55fbfa7efc188dd8a97ef40cbc"} Dec 02 16:05:44 crc kubenswrapper[4933]: I1202 16:05:44.656480 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-sg6kn" podStartSLOduration=2.108471075 podStartE2EDuration="10.656466403s" podCreationTimestamp="2025-12-02 16:05:34 +0000 UTC" firstStartedPulling="2025-12-02 16:05:35.705330949 +0000 UTC m=+798.956557652" lastFinishedPulling="2025-12-02 16:05:44.253326277 +0000 UTC m=+807.504552980" observedRunningTime="2025-12-02 16:05:44.653630537 +0000 UTC m=+807.904857240" watchObservedRunningTime="2025-12-02 16:05:44.656466403 +0000 UTC m=+807.907693106" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.557796 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.559456 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.564331 4933 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-lwkjl" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.564568 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.569621 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.571870 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.710102 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") " pod="minio-dev/minio" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.710463 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq266\" (UniqueName: \"kubernetes.io/projected/e62361ed-a86d-4e7c-95d4-3c5619e8e7bc-kube-api-access-hq266\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") " pod="minio-dev/minio" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.812430 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") " pod="minio-dev/minio" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.812562 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq266\" (UniqueName: \"kubernetes.io/projected/e62361ed-a86d-4e7c-95d4-3c5619e8e7bc-kube-api-access-hq266\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") " pod="minio-dev/minio" Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.815549 4933 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.815588 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8c48df7cb739d12287f1326c60668c9f68c2161d765ee4dc442a3148d94e416f/globalmount\"" pod="minio-dev/minio"
Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.838651 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq266\" (UniqueName: \"kubernetes.io/projected/e62361ed-a86d-4e7c-95d4-3c5619e8e7bc-kube-api-access-hq266\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") " pod="minio-dev/minio"
Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.848741 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-927fcad4-144b-4f22-a4d3-0d99d893a3a2\") pod \"minio\" (UID: \"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc\") " pod="minio-dev/minio"
Dec 02 16:05:48 crc kubenswrapper[4933]: I1202 16:05:48.884707 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Dec 02 16:05:49 crc kubenswrapper[4933]: I1202 16:05:49.129317 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Dec 02 16:05:49 crc kubenswrapper[4933]: W1202 16:05:49.133008 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode62361ed_a86d_4e7c_95d4_3c5619e8e7bc.slice/crio-659b15d753ef19b4b08b3d1ddb2a0b281488ff76e21370969686b55704646043 WatchSource:0}: Error finding container 659b15d753ef19b4b08b3d1ddb2a0b281488ff76e21370969686b55704646043: Status 404 returned error can't find the container with id 659b15d753ef19b4b08b3d1ddb2a0b281488ff76e21370969686b55704646043
Dec 02 16:05:49 crc kubenswrapper[4933]: I1202 16:05:49.681404 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc","Type":"ContainerStarted","Data":"659b15d753ef19b4b08b3d1ddb2a0b281488ff76e21370969686b55704646043"}
Dec 02 16:05:55 crc kubenswrapper[4933]: I1202 16:05:55.741138 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e62361ed-a86d-4e7c-95d4-3c5619e8e7bc","Type":"ContainerStarted","Data":"cf830ecf87e09c5a9972a934a45cbad75cb00ed923f7a63c96c11b182877b9b9"}
Dec 02 16:05:55 crc kubenswrapper[4933]: I1202 16:05:55.765450 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.259810689 podStartE2EDuration="9.765434503s" podCreationTimestamp="2025-12-02 16:05:46 +0000 UTC" firstStartedPulling="2025-12-02 16:05:49.134571643 +0000 UTC m=+812.385798356" lastFinishedPulling="2025-12-02 16:05:54.640195467 +0000 UTC m=+817.891422170" observedRunningTime="2025-12-02 16:05:55.761116408 +0000 UTC m=+819.012343121" watchObservedRunningTime="2025-12-02 16:05:55.765434503 +0000 UTC m=+819.016661206"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.680790 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"]
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.682333 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.685945 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.685989 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.686589 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.686804 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.688231 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-8xc5d"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.707911 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"]
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.795424 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.795601 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.795723 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d8d29e-d630-4e5b-9697-5f388253535a-config\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.795807 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.795885 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmm5c\" (UniqueName: \"kubernetes.io/projected/92d8d29e-d630-4e5b-9697-5f388253535a-kube-api-access-hmm5c\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.858141 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"]
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.859352 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.866013 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.866226 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.869630 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.879664 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"]
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.896797 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d8d29e-d630-4e5b-9697-5f388253535a-config\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.896899 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.896936 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmm5c\" (UniqueName: \"kubernetes.io/projected/92d8d29e-d630-4e5b-9697-5f388253535a-kube-api-access-hmm5c\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.897021 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.897061 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.898912 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d8d29e-d630-4e5b-9697-5f388253535a-config\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.902700 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.904267 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.905358 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/92d8d29e-d630-4e5b-9697-5f388253535a-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.927665 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmm5c\" (UniqueName: \"kubernetes.io/projected/92d8d29e-d630-4e5b-9697-5f388253535a-kube-api-access-hmm5c\") pod \"logging-loki-distributor-76cc67bf56-bps9l\" (UID: \"92d8d29e-d630-4e5b-9697-5f388253535a\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.959121 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"]
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.960230 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.962674 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.962999 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.978393 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"]
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.999178 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"
Dec 02 16:05:59 crc kubenswrapper[4933]: I1202 16:05:59.999686 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:05:59.999816 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b969ec3f-f1c9-4752-b473-1a14450526c1-config\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:05:59.999943 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.000160 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.000212 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.000252 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xm5\" (UniqueName: \"kubernetes.io/projected/b969ec3f-f1c9-4752-b473-1a14450526c1-kube-api-access-s4xm5\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.063487 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-fdb4c644d-28287"]
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.069629 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.078644 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.078867 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.078995 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.079103 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.079223 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111184 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111273 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111320 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111345 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111415 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4020f625-85e6-4986-84aa-1f73717025cb-config\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111448 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111500 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xm5\" (UniqueName: \"kubernetes.io/projected/b969ec3f-f1c9-4752-b473-1a14450526c1-kube-api-access-s4xm5\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111551 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwnkl\" (UniqueName: \"kubernetes.io/projected/4020f625-85e6-4986-84aa-1f73717025cb-kube-api-access-bwnkl\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111615 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.111701 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b969ec3f-f1c9-4752-b473-1a14450526c1-config\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.113131 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b969ec3f-f1c9-4752-b473-1a14450526c1-config\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.113785 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.122258 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.122950 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"]
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.130289 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.133187 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-nrxx7"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.135442 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.139912 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b969ec3f-f1c9-4752-b473-1a14450526c1-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.157090 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fdb4c644d-28287"]
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.162268 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4xm5\" (UniqueName: \"kubernetes.io/projected/b969ec3f-f1c9-4752-b473-1a14450526c1-kube-api-access-s4xm5\") pod \"logging-loki-querier-5895d59bb8-d4wmg\" (UID: \"b969ec3f-f1c9-4752-b473-1a14450526c1\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.171293 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"]
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.173148 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.217928 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.217983 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-rbac\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218018 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218079 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218106 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwnkl\" (UniqueName: \"kubernetes.io/projected/4020f625-85e6-4986-84aa-1f73717025cb-kube-api-access-bwnkl\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218138 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218176 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-lokistack-gateway\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218199 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22djj\" (UniqueName: \"kubernetes.io/projected/c016a791-ff2e-4466-95e5-80333c909784-kube-api-access-22djj\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218248 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-tls-secret\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218284 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-rbac\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218321 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-tenants\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218357 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218378 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218429 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-lokistack-gateway\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218481 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-tls-secret\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218533 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdthd\" (UniqueName: \"kubernetes.io/projected/349b144f-569c-411e-9651-db4659c64925-kube-api-access-bdthd\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218568 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.218599 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-tenants\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.219538 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.219579 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4020f625-85e6-4986-84aa-1f73717025cb-config\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.219597 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.220005 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.221380 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4020f625-85e6-4986-84aa-1f73717025cb-config\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.226680 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.229336 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/4020f625-85e6-4986-84aa-1f73717025cb-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.248469 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwnkl\" (UniqueName: \"kubernetes.io/projected/4020f625-85e6-4986-84aa-1f73717025cb-kube-api-access-bwnkl\") pod \"logging-loki-query-frontend-84558f7c9f-drtkk\" (UID: \"4020f625-85e6-4986-84aa-1f73717025cb\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.283949 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321503 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-tls-secret\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321561 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdthd\" (UniqueName: \"kubernetes.io/projected/349b144f-569c-411e-9651-db4659c64925-kube-api-access-bdthd\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321590 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-tenants\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321611 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321637 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287"
Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321661 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-rbac\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: 
\"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321680 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321711 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321730 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-lokistack-gateway\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321744 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22djj\" (UniqueName: \"kubernetes.io/projected/c016a791-ff2e-4466-95e5-80333c909784-kube-api-access-22djj\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321844 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-tls-secret\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321876 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-rbac\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.321963 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-tenants\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.322016 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.322034 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.322056 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-lokistack-gateway\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.322781 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.323068 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-lokistack-gateway\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.324208 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-rbac\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.324807 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-lokistack-gateway\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.325885 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-tls-secret\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.326027 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-tenants\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.326559 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: 
\"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.326657 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-rbac\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.327142 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.327323 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016a791-ff2e-4466-95e5-80333c909784-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.327902 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/349b144f-569c-411e-9651-db4659c64925-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.328569 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.328971 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-tls-secret\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.337460 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/349b144f-569c-411e-9651-db4659c64925-tenants\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.341909 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdthd\" (UniqueName: \"kubernetes.io/projected/349b144f-569c-411e-9651-db4659c64925-kube-api-access-bdthd\") pod \"logging-loki-gateway-fdb4c644d-28287\" (UID: \"349b144f-569c-411e-9651-db4659c64925\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.344177 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-22djj\" (UniqueName: \"kubernetes.io/projected/c016a791-ff2e-4466-95e5-80333c909784-kube-api-access-22djj\") pod \"logging-loki-gateway-fdb4c644d-wj2pd\" (UID: \"c016a791-ff2e-4466-95e5-80333c909784\") " pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.472858 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.487334 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.550535 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-bps9l"] Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.657483 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-d4wmg"] Dec 02 16:06:00 crc kubenswrapper[4933]: W1202 16:06:00.672219 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb969ec3f_f1c9_4752_b473_1a14450526c1.slice/crio-f1cfd70128a95a04b4f336c470185d2c0791bc68b092169436b88ac551a70adf WatchSource:0}: Error finding container f1cfd70128a95a04b4f336c470185d2c0791bc68b092169436b88ac551a70adf: Status 404 returned error can't find the container with id f1cfd70128a95a04b4f336c470185d2c0791bc68b092169436b88ac551a70adf Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.772139 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg" event={"ID":"b969ec3f-f1c9-4752-b473-1a14450526c1","Type":"ContainerStarted","Data":"f1cfd70128a95a04b4f336c470185d2c0791bc68b092169436b88ac551a70adf"} Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.778189 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l" event={"ID":"92d8d29e-d630-4e5b-9697-5f388253535a","Type":"ContainerStarted","Data":"99fd2eeb27acb9fd8641d8cf36660f5497b2fe86aae3d1b3223310d8eff64125"} Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.780897 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk"] Dec 02 16:06:00 crc kubenswrapper[4933]: W1202 16:06:00.803068 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4020f625_85e6_4986_84aa_1f73717025cb.slice/crio-cecd9a46ce396d049d7976277581ee046d8140d2a5a2a2044b91951938cc629e WatchSource:0}: Error finding container cecd9a46ce396d049d7976277581ee046d8140d2a5a2a2044b91951938cc629e: Status 404 returned error can't find the container with id cecd9a46ce396d049d7976277581ee046d8140d2a5a2a2044b91951938cc629e Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.859582 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.860632 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.862684 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.863424 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.880782 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.934162 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.935202 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.937575 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.937840 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 02 16:06:00 crc kubenswrapper[4933]: I1202 16:06:00.942768 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.004640 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.005454 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.007090 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.007301 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.024130 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.040898 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.040950 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.040985 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.041064 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.041092 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzdlb\" (UniqueName: \"kubernetes.io/projected/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-kube-api-access-jzdlb\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.041114 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e91993-69e7-44dc-bb73-5372571ce233-config\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.041133 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-config\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.041162 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.041752 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.043638 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.043811 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j67ng\" (UniqueName: \"kubernetes.io/projected/39e91993-69e7-44dc-bb73-5372571ce233-kube-api-access-j67ng\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " 
pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.043869 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.043931 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.044028 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.044065 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.087916 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd"] Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145343 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145397 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145439 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e91993-69e7-44dc-bb73-5372571ce233-config\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145459 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-config\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc 
kubenswrapper[4933]: I1202 16:06:01.145480 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145499 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145515 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145531 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j67ng\" (UniqueName: \"kubernetes.io/projected/39e91993-69e7-44dc-bb73-5372571ce233-kube-api-access-j67ng\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145551 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145573 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145589 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145605 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145624 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: 
\"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145660 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145683 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145707 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145725 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50300b3a-1c70-41d4-a37c-5b121b3c0efb-config\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145742 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzdlb\" (UniqueName: \"kubernetes.io/projected/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-kube-api-access-jzdlb\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145775 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftlxs\" (UniqueName: \"kubernetes.io/projected/50300b3a-1c70-41d4-a37c-5b121b3c0efb-kube-api-access-ftlxs\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145793 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.145813 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 
crc kubenswrapper[4933]: I1202 16:06:01.145843 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.146464 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.146792 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.147755 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e91993-69e7-44dc-bb73-5372571ce233-config\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.148052 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-config\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.150907 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.151844 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.152420 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/39e91993-69e7-44dc-bb73-5372571ce233-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.154125 4933 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.154144 4933 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
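[Note] The two csi_attacher.go:380 entries just above show kubelet probing the kubevirt.io.hostpath-provisioner CSI driver for the STAGE_UNSTAGE_VOLUME node capability and, finding it absent, skipping NodeStageVolume: MountDevice reduces to bookkeeping (the "MountVolume.MountDevice succeeded ... device mount path" entries that follow) and the volume is published directly for the pod. A minimal sketch of that capability probe, assuming a CSI node plugin reachable on a local UNIX socket (the socket path below is illustrative, not taken from this log):

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Illustrative socket path; real drivers register a socket under
	// /var/lib/kubelet/plugins/<driver-name>/.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// The same RPC kubelet issues before deciding whether NodeStageVolume
	// (and hence a real MountDevice step) is needed.
	resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		log.Fatal(err)
	}
	stageUnstage := false
	for _, c := range resp.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			stageUnstage = true
		}
	}
	// Prints "false" for a driver that, like the one logged above,
	// does not advertise STAGE_UNSTAGE_VOLUME.
	fmt.Println("STAGE_UNSTAGE_VOLUME advertised:", stageUnstage)
}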
Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.154159 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dfd635db5b74b1982f0d1546fd596cb0c8c33c566f4cf1dd5730107952de5c35/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.154182 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/43df9582a8e71d8ed226f87f161da5957b3cb8fe905e34d730ebba2903d931ff/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.154804 4933 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.154860 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6a8e3a697723867034900be076f16df3b33a6dd3360d59844f6bac066d4228bb/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.158750 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.158872 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.163433 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j67ng\" (UniqueName: \"kubernetes.io/projected/39e91993-69e7-44dc-bb73-5372571ce233-kube-api-access-j67ng\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.164052 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzdlb\" (UniqueName: \"kubernetes.io/projected/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-kube-api-access-jzdlb\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " 
pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.164784 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/cb0e0e26-ef5a-48bd-aa32-af5919aaf24a-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.196718 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fdb4c644d-28287"] Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.200165 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89148a37-d021-48ae-b1cd-4b797c6e0f51\") pod \"logging-loki-compactor-0\" (UID: \"39e91993-69e7-44dc-bb73-5372571ce233\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.205678 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efb9f168-4416-43a0-86b6-11c8b061ba34\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.206463 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c356f5ce-090e-46b0-a711-5dd96f2e8d89\") pod \"logging-loki-ingester-0\" (UID: \"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.247190 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.247234 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.247279 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50300b3a-1c70-41d4-a37c-5b121b3c0efb-config\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.247309 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftlxs\" (UniqueName: \"kubernetes.io/projected/50300b3a-1c70-41d4-a37c-5b121b3c0efb-kube-api-access-ftlxs\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc 
kubenswrapper[4933]: I1202 16:06:01.247328 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.247347 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.247378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.248864 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.249728 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50300b3a-1c70-41d4-a37c-5b121b3c0efb-config\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.250901 4933 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
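[Note] Entries like the ones above follow klog's structured format: a severity/timestamp header, a source file:line, a quoted message, then key="value" pairs (pod, UID, UniqueName). Because several entries are fused onto single journal lines here, it can help when triaging a stuck mount to reduce the stream to (operation, phase, volume, pod) tuples. A rough extraction sketch, assuming the journal text arrives on stdin; the field names mirror the log lines above, not any kubelet API:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches both "operationExecutor.MountVolume started for volume \"x\" ..."
// and "MountVolume.SetUp succeeded for volume \"x\" ..." forms, including the
// backslash-escaped quotes as they appear in the raw journal text.
var mountRe = regexp.MustCompile(
	`"([A-Za-z.]+) (started|succeeded) for volume \\"([^\\]+)\\".*?pod="([^"]+)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // fused journal lines can be long
	for sc.Scan() {
		// One physical line may carry several entries, so take all matches.
		for _, m := range mountRe.FindAllStringSubmatch(sc.Text(), -1) {
			// m[1]=operation m[2]=phase m[3]=volume m[4]=namespace/pod
			fmt.Printf("%-50s %-9s volume=%s pod=%s\n", m[1], m[2], m[3], m[4])
		}
	}
}

Fed a capture like the excerpt above (e.g. journalctl output piped in), this prints one line per mount operation, which makes it easy to spot a volume that starts but never reaches "succeeded".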
Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.250931 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/04cbc09a2bc3dfa25627a23a5fffc401832755c54927fa2c40924f774ed68b7d/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.251699 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.253907 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.256362 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/50300b3a-1c70-41d4-a37c-5b121b3c0efb-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.271416 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftlxs\" (UniqueName: \"kubernetes.io/projected/50300b3a-1c70-41d4-a37c-5b121b3c0efb-kube-api-access-ftlxs\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.272898 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.285677 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9ad069d5-32d5-4fcf-b18f-3f9846b0a02d\") pod \"logging-loki-index-gateway-0\" (UID: \"50300b3a-1c70-41d4-a37c-5b121b3c0efb\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.321125 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.485430 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.686149 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 02 16:06:01 crc kubenswrapper[4933]: W1202 16:06:01.712418 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39e91993_69e7_44dc_bb73_5372571ce233.slice/crio-9107e1b292d4e1bc8fb46c2e2804cb519cf89f4a3a05aceaab097cff49ac09eb WatchSource:0}: Error finding container 9107e1b292d4e1bc8fb46c2e2804cb519cf89f4a3a05aceaab097cff49ac09eb: Status 404 returned error can't find the container with id 9107e1b292d4e1bc8fb46c2e2804cb519cf89f4a3a05aceaab097cff49ac09eb Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.712847 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 02 16:06:01 crc kubenswrapper[4933]: W1202 16:06:01.718114 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb0e0e26_ef5a_48bd_aa32_af5919aaf24a.slice/crio-044cc7ad4a496ced3ac5cf05afd68e2078391dc33a0ff34bd165a7768033793a WatchSource:0}: Error finding container 044cc7ad4a496ced3ac5cf05afd68e2078391dc33a0ff34bd165a7768033793a: Status 404 returned error can't find the container with id 044cc7ad4a496ced3ac5cf05afd68e2078391dc33a0ff34bd165a7768033793a Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.763910 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.786646 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk" event={"ID":"4020f625-85e6-4986-84aa-1f73717025cb","Type":"ContainerStarted","Data":"cecd9a46ce396d049d7976277581ee046d8140d2a5a2a2044b91951938cc629e"} Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.788331 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" event={"ID":"c016a791-ff2e-4466-95e5-80333c909784","Type":"ContainerStarted","Data":"2e0203bea89e4964278fbb3b6e57324b313bb0b16c3aa05a8a54235d434d669a"} Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.789472 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"39e91993-69e7-44dc-bb73-5372571ce233","Type":"ContainerStarted","Data":"9107e1b292d4e1bc8fb46c2e2804cb519cf89f4a3a05aceaab097cff49ac09eb"} Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.791042 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a","Type":"ContainerStarted","Data":"044cc7ad4a496ced3ac5cf05afd68e2078391dc33a0ff34bd165a7768033793a"} Dec 02 16:06:01 crc kubenswrapper[4933]: I1202 16:06:01.793025 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" event={"ID":"349b144f-569c-411e-9651-db4659c64925","Type":"ContainerStarted","Data":"6d200ac68ee8946b651e14ee24130c7dc3d0fca1235ebc7a3d9ec133d1bc5c8c"} Dec 02 16:06:02 crc kubenswrapper[4933]: I1202 16:06:02.801740 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" 
event={"ID":"50300b3a-1c70-41d4-a37c-5b121b3c0efb","Type":"ContainerStarted","Data":"77c3ed194ebf1d6ad0d452a6795dc2f8a46b1f4759da5c8c4fd3b092c899e2bb"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.916021 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" event={"ID":"349b144f-569c-411e-9651-db4659c64925","Type":"ContainerStarted","Data":"bfa4e80ef7050298f1f78d6beab560aab470756784c0c751859c9c4f25fcc472"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.917454 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk" event={"ID":"4020f625-85e6-4986-84aa-1f73717025cb","Type":"ContainerStarted","Data":"7d63c5411f1dbad86aa6ef844a1ffd5e6abc0fc31904a4222ad0353ae00e2160"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.917621 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.919141 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" event={"ID":"c016a791-ff2e-4466-95e5-80333c909784","Type":"ContainerStarted","Data":"ae5fd89bef1b37efdb8733705c3d0ec6191c4cfcc658fc95fadd8eea64cc79f7"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.920663 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg" event={"ID":"b969ec3f-f1c9-4752-b473-1a14450526c1","Type":"ContainerStarted","Data":"9cb557e5f48e5b46dd2f2ae33c180b14140577dc8b97b2c00d7f36999c5ac0f8"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.921135 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.922730 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l" event={"ID":"92d8d29e-d630-4e5b-9697-5f388253535a","Type":"ContainerStarted","Data":"c5b09e1e8efa1fd42363639e33c3066004b4bf24f3532f23265c9b13f1c7ef6a"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.922873 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.923901 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"39e91993-69e7-44dc-bb73-5372571ce233","Type":"ContainerStarted","Data":"d4f00af3f2e2121a7d920e46d9849c69d64b724f74c93b7a1d596cd7f53fe8fd"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.923997 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.925227 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"cb0e0e26-ef5a-48bd-aa32-af5919aaf24a","Type":"ContainerStarted","Data":"ff038e5f2082cac550b59f523fe0018c9c33ad42dea31726434a64cd06e12855"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.925331 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.926489 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"50300b3a-1c70-41d4-a37c-5b121b3c0efb","Type":"ContainerStarted","Data":"ffb7490a82eb150f85d45e9e9a40712e9e1706025cf283ed85e61f4750f721cd"} Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.926599 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.937271 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk" podStartSLOduration=3.059863995 podStartE2EDuration="12.937236816s" podCreationTimestamp="2025-12-02 16:05:59 +0000 UTC" firstStartedPulling="2025-12-02 16:06:00.804498282 +0000 UTC m=+824.055724985" lastFinishedPulling="2025-12-02 16:06:10.681871083 +0000 UTC m=+833.933097806" observedRunningTime="2025-12-02 16:06:11.936154787 +0000 UTC m=+835.187381510" watchObservedRunningTime="2025-12-02 16:06:11.937236816 +0000 UTC m=+835.188463529" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.962712 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=4.023415667 podStartE2EDuration="12.962688384s" podCreationTimestamp="2025-12-02 16:05:59 +0000 UTC" firstStartedPulling="2025-12-02 16:06:01.720002224 +0000 UTC m=+824.971228927" lastFinishedPulling="2025-12-02 16:06:10.659274941 +0000 UTC m=+833.910501644" observedRunningTime="2025-12-02 16:06:11.956469338 +0000 UTC m=+835.207696061" watchObservedRunningTime="2025-12-02 16:06:11.962688384 +0000 UTC m=+835.213915087" Dec 02 16:06:11 crc kubenswrapper[4933]: I1202 16:06:11.987394 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=4.043092851 podStartE2EDuration="12.987359751s" podCreationTimestamp="2025-12-02 16:05:59 +0000 UTC" firstStartedPulling="2025-12-02 16:06:01.714624071 +0000 UTC m=+824.965850774" lastFinishedPulling="2025-12-02 16:06:10.658890981 +0000 UTC m=+833.910117674" observedRunningTime="2025-12-02 16:06:11.980387905 +0000 UTC m=+835.231614658" watchObservedRunningTime="2025-12-02 16:06:11.987359751 +0000 UTC m=+835.238586534" Dec 02 16:06:12 crc kubenswrapper[4933]: I1202 16:06:12.004508 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l" podStartSLOduration=2.908661448 podStartE2EDuration="13.004486427s" podCreationTimestamp="2025-12-02 16:05:59 +0000 UTC" firstStartedPulling="2025-12-02 16:06:00.585928441 +0000 UTC m=+823.837155134" lastFinishedPulling="2025-12-02 16:06:10.68175341 +0000 UTC m=+833.932980113" observedRunningTime="2025-12-02 16:06:11.997255154 +0000 UTC m=+835.248481877" watchObservedRunningTime="2025-12-02 16:06:12.004486427 +0000 UTC m=+835.255713140" Dec 02 16:06:12 crc kubenswrapper[4933]: I1202 16:06:12.019704 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg" podStartSLOduration=3.104254567 podStartE2EDuration="13.019675901s" podCreationTimestamp="2025-12-02 16:05:59 +0000 UTC" firstStartedPulling="2025-12-02 16:06:00.693802444 +0000 UTC m=+823.945029137" lastFinishedPulling="2025-12-02 16:06:10.609223768 +0000 UTC m=+833.860450471" observedRunningTime="2025-12-02 16:06:12.01247341 +0000 UTC m=+835.263700163" watchObservedRunningTime="2025-12-02 16:06:12.019675901 +0000 UTC 
m=+835.270902624" Dec 02 16:06:12 crc kubenswrapper[4933]: I1202 16:06:12.037751 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=4.152496934 podStartE2EDuration="13.037728992s" podCreationTimestamp="2025-12-02 16:05:59 +0000 UTC" firstStartedPulling="2025-12-02 16:06:01.788234081 +0000 UTC m=+825.039460784" lastFinishedPulling="2025-12-02 16:06:10.673466139 +0000 UTC m=+833.924692842" observedRunningTime="2025-12-02 16:06:12.028113346 +0000 UTC m=+835.279340049" watchObservedRunningTime="2025-12-02 16:06:12.037728992 +0000 UTC m=+835.288955705" Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.949765 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" event={"ID":"349b144f-569c-411e-9651-db4659c64925","Type":"ContainerStarted","Data":"dba38f364913ed21a49d4576d91518cf56c3a1908ab9f7984b43917ad0f38c3d"} Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.950492 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.952968 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" event={"ID":"c016a791-ff2e-4466-95e5-80333c909784","Type":"ContainerStarted","Data":"071a540998b0c6b464da6d4604da8959e5d9a555113588b5d05b5d7fc5f23ee8"} Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.953205 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.963141 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.963328 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:13 crc kubenswrapper[4933]: I1202 16:06:13.970898 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" podStartSLOduration=1.606448551 podStartE2EDuration="13.970869905s" podCreationTimestamp="2025-12-02 16:06:00 +0000 UTC" firstStartedPulling="2025-12-02 16:06:01.203051017 +0000 UTC m=+824.454277720" lastFinishedPulling="2025-12-02 16:06:13.567472371 +0000 UTC m=+836.818699074" observedRunningTime="2025-12-02 16:06:13.968762829 +0000 UTC m=+837.219989532" watchObservedRunningTime="2025-12-02 16:06:13.970869905 +0000 UTC m=+837.222096628" Dec 02 16:06:14 crc kubenswrapper[4933]: I1202 16:06:14.031153 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" podStartSLOduration=1.546192225 podStartE2EDuration="14.03113299s" podCreationTimestamp="2025-12-02 16:06:00 +0000 UTC" firstStartedPulling="2025-12-02 16:06:01.077358699 +0000 UTC m=+824.328585402" lastFinishedPulling="2025-12-02 16:06:13.562299464 +0000 UTC m=+836.813526167" observedRunningTime="2025-12-02 16:06:14.024929634 +0000 UTC m=+837.276156357" watchObservedRunningTime="2025-12-02 16:06:14.03113299 +0000 UTC m=+837.282359693" Dec 02 16:06:14 crc kubenswrapper[4933]: I1202 16:06:14.968058 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:14 crc kubenswrapper[4933]: I1202 16:06:14.968155 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:14 crc kubenswrapper[4933]: I1202 16:06:14.977668 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fdb4c644d-28287" Dec 02 16:06:14 crc kubenswrapper[4933]: I1202 16:06:14.984906 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fdb4c644d-wj2pd" Dec 02 16:06:30 crc kubenswrapper[4933]: I1202 16:06:30.005364 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-bps9l" Dec 02 16:06:30 crc kubenswrapper[4933]: I1202 16:06:30.180265 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-d4wmg" Dec 02 16:06:30 crc kubenswrapper[4933]: I1202 16:06:30.293283 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-drtkk" Dec 02 16:06:31 crc kubenswrapper[4933]: I1202 16:06:31.279784 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 02 16:06:31 crc kubenswrapper[4933]: I1202 16:06:31.329311 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 16:06:31 crc kubenswrapper[4933]: I1202 16:06:31.491653 4933 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 02 16:06:31 crc kubenswrapper[4933]: I1202 16:06:31.491702 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="cb0e0e26-ef5a-48bd-aa32-af5919aaf24a" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 16:06:41 crc kubenswrapper[4933]: I1202 16:06:41.493113 4933 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 02 16:06:41 crc kubenswrapper[4933]: I1202 16:06:41.493649 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="cb0e0e26-ef5a-48bd-aa32-af5919aaf24a" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 16:06:51 crc kubenswrapper[4933]: I1202 16:06:51.513033 4933 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 02 16:06:51 crc kubenswrapper[4933]: I1202 16:06:51.513722 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="cb0e0e26-ef5a-48bd-aa32-af5919aaf24a" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Dec 02 16:07:01 crc kubenswrapper[4933]: I1202 16:07:01.490531 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 02 16:07:07 crc kubenswrapper[4933]: I1202 16:07:07.872389 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wwk6n"] Dec 02 16:07:07 crc kubenswrapper[4933]: I1202 16:07:07.874520 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:07 crc kubenswrapper[4933]: I1202 16:07:07.885469 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwk6n"] Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.011679 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-catalog-content\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.012003 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rt8\" (UniqueName: \"kubernetes.io/projected/147e5cf5-d45f-48df-89af-fad86d91085c-kube-api-access-29rt8\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.012201 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-utilities\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.113292 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-utilities\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.113419 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-catalog-content\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.113473 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29rt8\" (UniqueName: \"kubernetes.io/projected/147e5cf5-d45f-48df-89af-fad86d91085c-kube-api-access-29rt8\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.114248 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-utilities\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc 
kubenswrapper[4933]: I1202 16:07:08.114466 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-catalog-content\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.134441 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29rt8\" (UniqueName: \"kubernetes.io/projected/147e5cf5-d45f-48df-89af-fad86d91085c-kube-api-access-29rt8\") pod \"redhat-marketplace-wwk6n\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.192772 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:08 crc kubenswrapper[4933]: I1202 16:07:08.657497 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwk6n"] Dec 02 16:07:08 crc kubenswrapper[4933]: W1202 16:07:08.677693 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod147e5cf5_d45f_48df_89af_fad86d91085c.slice/crio-cb515e4227c51115dfc40982819e37ad448253589f8519eafdd0202e2365116e WatchSource:0}: Error finding container cb515e4227c51115dfc40982819e37ad448253589f8519eafdd0202e2365116e: Status 404 returned error can't find the container with id cb515e4227c51115dfc40982819e37ad448253589f8519eafdd0202e2365116e Dec 02 16:07:09 crc kubenswrapper[4933]: I1202 16:07:09.374078 4933 generic.go:334] "Generic (PLEG): container finished" podID="147e5cf5-d45f-48df-89af-fad86d91085c" containerID="13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b" exitCode=0 Dec 02 16:07:09 crc kubenswrapper[4933]: I1202 16:07:09.374548 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwk6n" event={"ID":"147e5cf5-d45f-48df-89af-fad86d91085c","Type":"ContainerDied","Data":"13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b"} Dec 02 16:07:09 crc kubenswrapper[4933]: I1202 16:07:09.374640 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwk6n" event={"ID":"147e5cf5-d45f-48df-89af-fad86d91085c","Type":"ContainerStarted","Data":"cb515e4227c51115dfc40982819e37ad448253589f8519eafdd0202e2365116e"} Dec 02 16:07:10 crc kubenswrapper[4933]: I1202 16:07:10.382921 4933 generic.go:334] "Generic (PLEG): container finished" podID="147e5cf5-d45f-48df-89af-fad86d91085c" containerID="3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831" exitCode=0 Dec 02 16:07:10 crc kubenswrapper[4933]: I1202 16:07:10.383004 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwk6n" event={"ID":"147e5cf5-d45f-48df-89af-fad86d91085c","Type":"ContainerDied","Data":"3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831"} Dec 02 16:07:11 crc kubenswrapper[4933]: I1202 16:07:11.390753 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwk6n" event={"ID":"147e5cf5-d45f-48df-89af-fad86d91085c","Type":"ContainerStarted","Data":"366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624"} Dec 02 16:07:11 crc kubenswrapper[4933]: I1202 16:07:11.415137 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wwk6n" podStartSLOduration=2.932255949 podStartE2EDuration="4.415121079s" podCreationTimestamp="2025-12-02 16:07:07 +0000 UTC" firstStartedPulling="2025-12-02 16:07:09.377721256 +0000 UTC m=+892.628947999" lastFinishedPulling="2025-12-02 16:07:10.860586426 +0000 UTC m=+894.111813129" observedRunningTime="2025-12-02 16:07:11.412107109 +0000 UTC m=+894.663333812" watchObservedRunningTime="2025-12-02 16:07:11.415121079 +0000 UTC m=+894.666347782" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.269814 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qm2j4"] Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.272522 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.282270 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm2j4"] Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.311042 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-catalog-content\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.311151 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-utilities\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.311227 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrbl\" (UniqueName: \"kubernetes.io/projected/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-kube-api-access-5lrbl\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.412563 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrbl\" (UniqueName: \"kubernetes.io/projected/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-kube-api-access-5lrbl\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.412699 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-catalog-content\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.412728 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-utilities\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:14 
crc kubenswrapper[4933]: I1202 16:07:14.413189 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-utilities\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4"
Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.413295 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-catalog-content\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4"
Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.437761 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrbl\" (UniqueName: \"kubernetes.io/projected/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-kube-api-access-5lrbl\") pod \"community-operators-qm2j4\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " pod="openshift-marketplace/community-operators-qm2j4"
Dec 02 16:07:14 crc kubenswrapper[4933]: I1202 16:07:14.599640 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qm2j4"
Dec 02 16:07:15 crc kubenswrapper[4933]: I1202 16:07:15.131831 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm2j4"]
Dec 02 16:07:15 crc kubenswrapper[4933]: I1202 16:07:15.415403 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerStarted","Data":"f1681296642ad5eaf69e44500dd1746757b03f605d0fee7b0a204851533b0f7b"}
Dec 02 16:07:15 crc kubenswrapper[4933]: I1202 16:07:15.415709 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerStarted","Data":"72a782cff9c2fcde9c21773ca0b59cab4bbc56519fcd2d5b2f3267abbebe9923"}
Dec 02 16:07:16 crc kubenswrapper[4933]: I1202 16:07:16.421694 4933 generic.go:334] "Generic (PLEG): container finished" podID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerID="f1681296642ad5eaf69e44500dd1746757b03f605d0fee7b0a204851533b0f7b" exitCode=0
Dec 02 16:07:16 crc kubenswrapper[4933]: I1202 16:07:16.421729 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerDied","Data":"f1681296642ad5eaf69e44500dd1746757b03f605d0fee7b0a204851533b0f7b"}
Dec 02 16:07:17 crc kubenswrapper[4933]: I1202 16:07:17.170659 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:07:17 crc kubenswrapper[4933]: I1202 16:07:17.171065 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
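The ContainerStarted/ContainerDied pairs with exitCode=0 around this point are the normal init sequence for a marketplace catalog pod: short-lived extract containers run to completion before the long-running registry-server starts (the names extract-utilities, extract-content, and registry-server appear later in this log when their CPU-manager state is cleaned up). A sketch of how such completions look through the API, assuming a configured client-go clientset; checkInitContainers is illustrative:

    // Lists init containers that terminated successfully for the catalog pod.
    package example

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    func checkInitContainers(ctx context.Context, cs kubernetes.Interface) error {
        pod, err := cs.CoreV1().Pods("openshift-marketplace").Get(ctx, "community-operators-qm2j4", metav1.GetOptions{})
        if err != nil {
            return err
        }
        for _, st := range pod.Status.InitContainerStatuses {
            // exitCode=0 in the PLEG events corresponds to Terminated.ExitCode == 0 here.
            if t := st.State.Terminated; t != nil && t.ExitCode == 0 {
                fmt.Printf("init container %s completed\n", st.Name)
            }
        }
        return nil
    }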
4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerStarted","Data":"a6ad93c09b6c59fb8d2355d3a552dd14944baa457a08c174e03f9cb8c03f9a99"} Dec 02 16:07:18 crc kubenswrapper[4933]: I1202 16:07:18.193448 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:18 crc kubenswrapper[4933]: I1202 16:07:18.194316 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:18 crc kubenswrapper[4933]: I1202 16:07:18.268704 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:18 crc kubenswrapper[4933]: I1202 16:07:18.439903 4933 generic.go:334] "Generic (PLEG): container finished" podID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerID="a6ad93c09b6c59fb8d2355d3a552dd14944baa457a08c174e03f9cb8c03f9a99" exitCode=0 Dec 02 16:07:18 crc kubenswrapper[4933]: I1202 16:07:18.440001 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerDied","Data":"a6ad93c09b6c59fb8d2355d3a552dd14944baa457a08c174e03f9cb8c03f9a99"} Dec 02 16:07:18 crc kubenswrapper[4933]: I1202 16:07:18.492912 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:19 crc kubenswrapper[4933]: I1202 16:07:19.449205 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerStarted","Data":"f9653bba51eb155877136b396e0191619cd165d85184cb460c69c5cf94fed441"} Dec 02 16:07:19 crc kubenswrapper[4933]: I1202 16:07:19.469165 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qm2j4" podStartSLOduration=2.959892048 podStartE2EDuration="5.469143255s" podCreationTimestamp="2025-12-02 16:07:14 +0000 UTC" firstStartedPulling="2025-12-02 16:07:16.423319933 +0000 UTC m=+899.674546636" lastFinishedPulling="2025-12-02 16:07:18.93257113 +0000 UTC m=+902.183797843" observedRunningTime="2025-12-02 16:07:19.464609755 +0000 UTC m=+902.715836458" watchObservedRunningTime="2025-12-02 16:07:19.469143255 +0000 UTC m=+902.720369948" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.672649 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwk6n"] Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.900433 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-p9c9w"] Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.902138 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.909119 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.909284 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.909468 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-tm4kr" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.910742 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-sa-token\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.910940 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqgsg\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-kube-api-access-pqgsg\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.911130 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2016e5cd-b786-40e5-a0a9-f77df27d8272-tmp\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.911314 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-token\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.911461 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-trusted-ca\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.911652 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-entrypoint\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.911789 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.911959 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2016e5cd-b786-40e5-a0a9-f77df27d8272-datadir\") 
pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.912114 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.912264 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config-openshift-service-cacrt\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.912428 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.923926 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.929302 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.945399 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-p9c9w"] Dec 02 16:07:20 crc kubenswrapper[4933]: I1202 16:07:20.946957 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.013995 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2016e5cd-b786-40e5-a0a9-f77df27d8272-tmp\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014065 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-token\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014089 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-trusted-ca\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014120 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-entrypoint\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014139 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014161 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2016e5cd-b786-40e5-a0a9-f77df27d8272-datadir\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014178 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014199 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config-openshift-service-cacrt\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014230 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014270 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-sa-token\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014288 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqgsg\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-kube-api-access-pqgsg\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: E1202 16:07:21.014315 4933 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.014366 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2016e5cd-b786-40e5-a0a9-f77df27d8272-datadir\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: E1202 16:07:21.014389 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver podName:2016e5cd-b786-40e5-a0a9-f77df27d8272 nodeName:}" failed. No retries permitted until 2025-12-02 16:07:21.514371245 +0000 UTC m=+904.765597948 (durationBeforeRetry 500ms). 
Dec 02 16:07:21 crc kubenswrapper[4933]: E1202 16:07:21.014389 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver podName:2016e5cd-b786-40e5-a0a9-f77df27d8272 nodeName:}" failed. No retries permitted until 2025-12-02 16:07:21.514371245 +0000 UTC m=+904.765597948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver") pod "collector-p9c9w" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272") : secret "collector-syslog-receiver" not found
Dec 02 16:07:21 crc kubenswrapper[4933]: E1202 16:07:21.014518 4933 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found
Dec 02 16:07:21 crc kubenswrapper[4933]: E1202 16:07:21.014575 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics podName:2016e5cd-b786-40e5-a0a9-f77df27d8272 nodeName:}" failed. No retries permitted until 2025-12-02 16:07:21.51455199 +0000 UTC m=+904.765778763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics") pod "collector-p9c9w" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272") : secret "collector-metrics" not found
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.015313 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-trusted-ca\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w"
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.015343 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config-openshift-service-cacrt\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w"
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.015353 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w"
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.015341 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-entrypoint\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w"
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.019070 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2016e5cd-b786-40e5-a0a9-f77df27d8272-tmp\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w"
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.028423 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-token\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w"
Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.046449 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqgsg\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-kube-api-access-pqgsg\") pod \"collector-p9c9w\" (UID:
\"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.049055 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-sa-token\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.154713 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-p9c9w"] Dec 02 16:07:21 crc kubenswrapper[4933]: E1202 16:07:21.155258 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver metrics], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-p9c9w" podUID="2016e5cd-b786-40e5-a0a9-f77df27d8272" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.459413 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.459551 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wwk6n" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="registry-server" containerID="cri-o://366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624" gracePeriod=2 Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.470332 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521187 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521226 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-token\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521249 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2016e5cd-b786-40e5-a0a9-f77df27d8272-datadir\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521323 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config-openshift-service-cacrt\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521352 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-trusted-ca\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521375 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-entrypoint\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521398 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-sa-token\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521384 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2016e5cd-b786-40e5-a0a9-f77df27d8272-datadir" (OuterVolumeSpecName: "datadir") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521434 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2016e5cd-b786-40e5-a0a9-f77df27d8272-tmp\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521462 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqgsg\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-kube-api-access-pqgsg\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521693 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521763 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521816 4933 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2016e5cd-b786-40e5-a0a9-f77df27d8272-datadir\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521855 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521890 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521937 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config" (OuterVolumeSpecName: "config") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.521967 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.524678 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-token" (OuterVolumeSpecName: "collector-token") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.524712 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-kube-api-access-pqgsg" (OuterVolumeSpecName: "kube-api-access-pqgsg") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "kube-api-access-pqgsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.524970 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-sa-token" (OuterVolumeSpecName: "sa-token") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.530972 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2016e5cd-b786-40e5-a0a9-f77df27d8272-tmp" (OuterVolumeSpecName: "tmp") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.531385 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.531685 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver\") pod \"collector-p9c9w\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " pod="openshift-logging/collector-p9c9w" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677803 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677851 4933 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-token\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677866 4933 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677877 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677888 4933 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2016e5cd-b786-40e5-a0a9-f77df27d8272-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677899 4933 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677910 4933 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2016e5cd-b786-40e5-a0a9-f77df27d8272-tmp\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.677920 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqgsg\" (UniqueName: \"kubernetes.io/projected/2016e5cd-b786-40e5-a0a9-f77df27d8272-kube-api-access-pqgsg\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.778746 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics\") pod \"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.778909 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver\") pod 
\"2016e5cd-b786-40e5-a0a9-f77df27d8272\" (UID: \"2016e5cd-b786-40e5-a0a9-f77df27d8272\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.785364 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.785487 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics" (OuterVolumeSpecName: "metrics") pod "2016e5cd-b786-40e5-a0a9-f77df27d8272" (UID: "2016e5cd-b786-40e5-a0a9-f77df27d8272"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.873538 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.880265 4933 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.880384 4933 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2016e5cd-b786-40e5-a0a9-f77df27d8272-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.981698 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29rt8\" (UniqueName: \"kubernetes.io/projected/147e5cf5-d45f-48df-89af-fad86d91085c-kube-api-access-29rt8\") pod \"147e5cf5-d45f-48df-89af-fad86d91085c\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.981815 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-catalog-content\") pod \"147e5cf5-d45f-48df-89af-fad86d91085c\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.981924 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-utilities\") pod \"147e5cf5-d45f-48df-89af-fad86d91085c\" (UID: \"147e5cf5-d45f-48df-89af-fad86d91085c\") " Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.982891 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-utilities" (OuterVolumeSpecName: "utilities") pod "147e5cf5-d45f-48df-89af-fad86d91085c" (UID: "147e5cf5-d45f-48df-89af-fad86d91085c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.985394 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147e5cf5-d45f-48df-89af-fad86d91085c-kube-api-access-29rt8" (OuterVolumeSpecName: "kube-api-access-29rt8") pod "147e5cf5-d45f-48df-89af-fad86d91085c" (UID: "147e5cf5-d45f-48df-89af-fad86d91085c"). InnerVolumeSpecName "kube-api-access-29rt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:07:21 crc kubenswrapper[4933]: I1202 16:07:21.997859 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "147e5cf5-d45f-48df-89af-fad86d91085c" (UID: "147e5cf5-d45f-48df-89af-fad86d91085c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.083721 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.084032 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/147e5cf5-d45f-48df-89af-fad86d91085c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.084190 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29rt8\" (UniqueName: \"kubernetes.io/projected/147e5cf5-d45f-48df-89af-fad86d91085c-kube-api-access-29rt8\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.458069 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7gxvc"] Dec 02 16:07:22 crc kubenswrapper[4933]: E1202 16:07:22.458394 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="extract-utilities" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.458418 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="extract-utilities" Dec 02 16:07:22 crc kubenswrapper[4933]: E1202 16:07:22.458442 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="registry-server" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.458452 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="registry-server" Dec 02 16:07:22 crc kubenswrapper[4933]: E1202 16:07:22.458464 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="extract-content" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.458471 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="extract-content" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.458624 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" containerName="registry-server" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.459849 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.472453 4933 generic.go:334] "Generic (PLEG): container finished" podID="147e5cf5-d45f-48df-89af-fad86d91085c" containerID="366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624" exitCode=0 Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.472600 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-p9c9w" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.472668 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwk6n" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.472744 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwk6n" event={"ID":"147e5cf5-d45f-48df-89af-fad86d91085c","Type":"ContainerDied","Data":"366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624"} Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.472807 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwk6n" event={"ID":"147e5cf5-d45f-48df-89af-fad86d91085c","Type":"ContainerDied","Data":"cb515e4227c51115dfc40982819e37ad448253589f8519eafdd0202e2365116e"} Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.472860 4933 scope.go:117] "RemoveContainer" containerID="366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.491696 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7gxvc"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.516776 4933 scope.go:117] "RemoveContainer" containerID="3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.547717 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwk6n"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.547806 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwk6n"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.577923 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-p9c9w"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.591780 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-catalog-content\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.591864 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6fh\" (UniqueName: \"kubernetes.io/projected/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-kube-api-access-xj6fh\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.591915 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-utilities\") pod \"certified-operators-7gxvc\" (UID: 
\"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.592139 4933 scope.go:117] "RemoveContainer" containerID="13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.593686 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-p9c9w"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.602511 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-9hj2j"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.603499 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.606344 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.606622 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.606785 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-tm4kr" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.606959 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.608921 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-9hj2j"] Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.609356 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.614306 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.614633 4933 scope.go:117] "RemoveContainer" containerID="366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624" Dec 02 16:07:22 crc kubenswrapper[4933]: E1202 16:07:22.615663 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624\": container with ID starting with 366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624 not found: ID does not exist" containerID="366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.615750 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624"} err="failed to get container status \"366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624\": rpc error: code = NotFound desc = could not find container \"366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624\": container with ID starting with 366722504a4d507d246a5e0ad5755e2392e80f348a08160c55a734d257159624 not found: ID does not exist" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.615855 4933 scope.go:117] "RemoveContainer" containerID="3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831" Dec 02 16:07:22 crc kubenswrapper[4933]: E1202 16:07:22.616248 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831\": container with ID starting with 3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831 not found: ID does not exist" containerID="3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.616328 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831"} err="failed to get container status \"3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831\": rpc error: code = NotFound desc = could not find container \"3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831\": container with ID starting with 3dec896d1ab73062b0710d156f553cd3cc794cf726b6b7359d98f6653ff43831 not found: ID does not exist" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.616390 4933 scope.go:117] "RemoveContainer" containerID="13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b" Dec 02 16:07:22 crc kubenswrapper[4933]: E1202 16:07:22.616731 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b\": container with ID starting with 13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b not found: ID does not exist" containerID="13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.616813 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b"} err="failed to get container status \"13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b\": rpc error: code = NotFound desc = could not find container \"13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b\": container with ID starting with 13a9ecca697c798df3d824658e5eebaaa6c3d5695b47fb063a17604651a0d88b not found: ID does not exist" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.693759 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-metrics\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.693813 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-tmp\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.693849 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-config-openshift-service-cacrt\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.693875 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-config\") 
pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.693895 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-collector-token\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694058 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-trusted-ca\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694144 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-catalog-content\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694188 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-entrypoint\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694220 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-collector-syslog-receiver\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694242 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnrs\" (UniqueName: \"kubernetes.io/projected/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-kube-api-access-wtnrs\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694281 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6fh\" (UniqueName: \"kubernetes.io/projected/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-kube-api-access-xj6fh\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694355 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-utilities\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694430 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-datadir\") pod 
\"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694451 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-sa-token\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694584 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-catalog-content\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.694742 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-utilities\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.710733 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6fh\" (UniqueName: \"kubernetes.io/projected/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-kube-api-access-xj6fh\") pod \"certified-operators-7gxvc\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.795814 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-metrics\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.795973 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-tmp\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796004 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-config-openshift-service-cacrt\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796040 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-config\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796067 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-collector-token\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796127 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-trusted-ca\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796176 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-entrypoint\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796205 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-collector-syslog-receiver\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796229 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnrs\" (UniqueName: \"kubernetes.io/projected/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-kube-api-access-wtnrs\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796273 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-datadir\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.796296 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-sa-token\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.797053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-config-openshift-service-cacrt\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.797375 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-entrypoint\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.797629 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-trusted-ca\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.797990 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-datadir\") pod \"collector-9hj2j\" (UID: 
\"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.798172 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-config\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.799032 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-metrics\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.801210 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-collector-syslog-receiver\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.801496 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-tmp\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.804212 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.804528 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-collector-token\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.820004 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-sa-token\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.821095 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnrs\" (UniqueName: \"kubernetes.io/projected/1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d-kube-api-access-wtnrs\") pod \"collector-9hj2j\" (UID: \"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d\") " pod="openshift-logging/collector-9hj2j" Dec 02 16:07:22 crc kubenswrapper[4933]: I1202 16:07:22.955198 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-9hj2j" Dec 02 16:07:23 crc kubenswrapper[4933]: I1202 16:07:23.074027 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147e5cf5-d45f-48df-89af-fad86d91085c" path="/var/lib/kubelet/pods/147e5cf5-d45f-48df-89af-fad86d91085c/volumes" Dec 02 16:07:23 crc kubenswrapper[4933]: I1202 16:07:23.074920 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2016e5cd-b786-40e5-a0a9-f77df27d8272" path="/var/lib/kubelet/pods/2016e5cd-b786-40e5-a0a9-f77df27d8272/volumes" Dec 02 16:07:23 crc kubenswrapper[4933]: I1202 16:07:23.147919 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7gxvc"] Dec 02 16:07:23 crc kubenswrapper[4933]: W1202 16:07:23.191717 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2703eb_5f67_4e57_abb7_f14e41b1daf4.slice/crio-188b8ae8d48ff2c40876a60760bf0c47e5efb84ed2c1f317d4a8c89c722c8cf6 WatchSource:0}: Error finding container 188b8ae8d48ff2c40876a60760bf0c47e5efb84ed2c1f317d4a8c89c722c8cf6: Status 404 returned error can't find the container with id 188b8ae8d48ff2c40876a60760bf0c47e5efb84ed2c1f317d4a8c89c722c8cf6 Dec 02 16:07:23 crc kubenswrapper[4933]: I1202 16:07:23.479849 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gxvc" event={"ID":"ab2703eb-5f67-4e57-abb7-f14e41b1daf4","Type":"ContainerStarted","Data":"188b8ae8d48ff2c40876a60760bf0c47e5efb84ed2c1f317d4a8c89c722c8cf6"} Dec 02 16:07:23 crc kubenswrapper[4933]: I1202 16:07:23.546333 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-9hj2j"] Dec 02 16:07:24 crc kubenswrapper[4933]: I1202 16:07:24.490085 4933 generic.go:334] "Generic (PLEG): container finished" podID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerID="9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8" exitCode=0 Dec 02 16:07:24 crc kubenswrapper[4933]: I1202 16:07:24.490156 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gxvc" event={"ID":"ab2703eb-5f67-4e57-abb7-f14e41b1daf4","Type":"ContainerDied","Data":"9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8"} Dec 02 16:07:24 crc kubenswrapper[4933]: I1202 16:07:24.491981 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-9hj2j" event={"ID":"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d","Type":"ContainerStarted","Data":"e52c1ddb3d26b4a8407bb43114582b732dd35a5a422e9be12901b08928d89f6c"} Dec 02 16:07:24 crc kubenswrapper[4933]: I1202 16:07:24.600637 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:24 crc kubenswrapper[4933]: I1202 16:07:24.600717 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:24 crc kubenswrapper[4933]: I1202 16:07:24.639021 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:25 crc kubenswrapper[4933]: I1202 16:07:25.555313 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:26 crc kubenswrapper[4933]: I1202 16:07:26.511718 4933 generic.go:334] "Generic (PLEG): container finished" 
podID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerID="d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd" exitCode=0 Dec 02 16:07:26 crc kubenswrapper[4933]: I1202 16:07:26.511752 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gxvc" event={"ID":"ab2703eb-5f67-4e57-abb7-f14e41b1daf4","Type":"ContainerDied","Data":"d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd"} Dec 02 16:07:27 crc kubenswrapper[4933]: I1202 16:07:27.532497 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gxvc" event={"ID":"ab2703eb-5f67-4e57-abb7-f14e41b1daf4","Type":"ContainerStarted","Data":"54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16"} Dec 02 16:07:27 crc kubenswrapper[4933]: I1202 16:07:27.554344 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7gxvc" podStartSLOduration=3.047336131 podStartE2EDuration="5.554325758s" podCreationTimestamp="2025-12-02 16:07:22 +0000 UTC" firstStartedPulling="2025-12-02 16:07:24.491971705 +0000 UTC m=+907.743198408" lastFinishedPulling="2025-12-02 16:07:26.998961332 +0000 UTC m=+910.250188035" observedRunningTime="2025-12-02 16:07:27.553563548 +0000 UTC m=+910.804790281" watchObservedRunningTime="2025-12-02 16:07:27.554325758 +0000 UTC m=+910.805552461" Dec 02 16:07:28 crc kubenswrapper[4933]: I1202 16:07:28.854088 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qm2j4"] Dec 02 16:07:28 crc kubenswrapper[4933]: I1202 16:07:28.854577 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qm2j4" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="registry-server" containerID="cri-o://f9653bba51eb155877136b396e0191619cd165d85184cb460c69c5cf94fed441" gracePeriod=2 Dec 02 16:07:29 crc kubenswrapper[4933]: I1202 16:07:29.549428 4933 generic.go:334] "Generic (PLEG): container finished" podID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerID="f9653bba51eb155877136b396e0191619cd165d85184cb460c69c5cf94fed441" exitCode=0 Dec 02 16:07:29 crc kubenswrapper[4933]: I1202 16:07:29.549473 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerDied","Data":"f9653bba51eb155877136b396e0191619cd165d85184cb460c69c5cf94fed441"} Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.275898 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.474581 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-utilities\") pod \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.474644 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-catalog-content\") pod \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.474675 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lrbl\" (UniqueName: \"kubernetes.io/projected/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-kube-api-access-5lrbl\") pod \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\" (UID: \"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900\") " Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.475715 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-utilities" (OuterVolumeSpecName: "utilities") pod "7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" (UID: "7bc26a40-ae58-4c28-ade2-7ea4e2bb9900"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.479495 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-kube-api-access-5lrbl" (OuterVolumeSpecName: "kube-api-access-5lrbl") pod "7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" (UID: "7bc26a40-ae58-4c28-ade2-7ea4e2bb9900"). InnerVolumeSpecName "kube-api-access-5lrbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.524039 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" (UID: "7bc26a40-ae58-4c28-ade2-7ea4e2bb9900"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.572485 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2j4" event={"ID":"7bc26a40-ae58-4c28-ade2-7ea4e2bb9900","Type":"ContainerDied","Data":"72a782cff9c2fcde9c21773ca0b59cab4bbc56519fcd2d5b2f3267abbebe9923"} Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.572533 4933 scope.go:117] "RemoveContainer" containerID="f9653bba51eb155877136b396e0191619cd165d85184cb460c69c5cf94fed441" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.572643 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm2j4" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.576427 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-9hj2j" event={"ID":"1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d","Type":"ContainerStarted","Data":"725b0867fe7bc1138ece49713aa22dcba4a7de2a19a9f41d70acb4deeb7b3ec1"} Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.577545 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.577573 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.577604 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lrbl\" (UniqueName: \"kubernetes.io/projected/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900-kube-api-access-5lrbl\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.594747 4933 scope.go:117] "RemoveContainer" containerID="a6ad93c09b6c59fb8d2355d3a552dd14944baa457a08c174e03f9cb8c03f9a99" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.629094 4933 scope.go:117] "RemoveContainer" containerID="f1681296642ad5eaf69e44500dd1746757b03f605d0fee7b0a204851533b0f7b" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.630644 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-9hj2j" podStartSLOduration=2.161233026 podStartE2EDuration="10.630618579s" podCreationTimestamp="2025-12-02 16:07:22 +0000 UTC" firstStartedPulling="2025-12-02 16:07:23.553183369 +0000 UTC m=+906.804410072" lastFinishedPulling="2025-12-02 16:07:32.022568922 +0000 UTC m=+915.273795625" observedRunningTime="2025-12-02 16:07:32.603810346 +0000 UTC m=+915.855037059" watchObservedRunningTime="2025-12-02 16:07:32.630618579 +0000 UTC m=+915.881845302" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.633978 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qm2j4"] Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.643786 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qm2j4"] Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.805149 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.805220 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:32 crc kubenswrapper[4933]: I1202 16:07:32.859188 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:33 crc kubenswrapper[4933]: I1202 16:07:33.065552 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" path="/var/lib/kubelet/pods/7bc26a40-ae58-4c28-ade2-7ea4e2bb9900/volumes" Dec 02 16:07:33 crc kubenswrapper[4933]: I1202 16:07:33.656386 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:35 crc 
kubenswrapper[4933]: I1202 16:07:35.248759 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7gxvc"] Dec 02 16:07:35 crc kubenswrapper[4933]: I1202 16:07:35.603105 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7gxvc" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="registry-server" containerID="cri-o://54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16" gracePeriod=2 Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.013439 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.137255 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj6fh\" (UniqueName: \"kubernetes.io/projected/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-kube-api-access-xj6fh\") pod \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.137325 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-utilities\") pod \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.137348 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-catalog-content\") pod \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\" (UID: \"ab2703eb-5f67-4e57-abb7-f14e41b1daf4\") " Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.139335 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-utilities" (OuterVolumeSpecName: "utilities") pod "ab2703eb-5f67-4e57-abb7-f14e41b1daf4" (UID: "ab2703eb-5f67-4e57-abb7-f14e41b1daf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.144264 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-kube-api-access-xj6fh" (OuterVolumeSpecName: "kube-api-access-xj6fh") pod "ab2703eb-5f67-4e57-abb7-f14e41b1daf4" (UID: "ab2703eb-5f67-4e57-abb7-f14e41b1daf4"). InnerVolumeSpecName "kube-api-access-xj6fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.193027 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab2703eb-5f67-4e57-abb7-f14e41b1daf4" (UID: "ab2703eb-5f67-4e57-abb7-f14e41b1daf4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.239625 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj6fh\" (UniqueName: \"kubernetes.io/projected/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-kube-api-access-xj6fh\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.239671 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.239803 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab2703eb-5f67-4e57-abb7-f14e41b1daf4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.611964 4933 generic.go:334] "Generic (PLEG): container finished" podID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerID="54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16" exitCode=0 Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.612007 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gxvc" event={"ID":"ab2703eb-5f67-4e57-abb7-f14e41b1daf4","Type":"ContainerDied","Data":"54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16"} Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.612033 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gxvc" event={"ID":"ab2703eb-5f67-4e57-abb7-f14e41b1daf4","Type":"ContainerDied","Data":"188b8ae8d48ff2c40876a60760bf0c47e5efb84ed2c1f317d4a8c89c722c8cf6"} Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.612049 4933 scope.go:117] "RemoveContainer" containerID="54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.612166 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7gxvc" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.643530 4933 scope.go:117] "RemoveContainer" containerID="d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.647129 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7gxvc"] Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.653068 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7gxvc"] Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.672236 4933 scope.go:117] "RemoveContainer" containerID="9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.688149 4933 scope.go:117] "RemoveContainer" containerID="54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16" Dec 02 16:07:36 crc kubenswrapper[4933]: E1202 16:07:36.688604 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16\": container with ID starting with 54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16 not found: ID does not exist" containerID="54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.688645 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16"} err="failed to get container status \"54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16\": rpc error: code = NotFound desc = could not find container \"54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16\": container with ID starting with 54fd4668c5d84e142534e33bae99bd78d6d201fdba52f41a013c198748b6db16 not found: ID does not exist" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.688673 4933 scope.go:117] "RemoveContainer" containerID="d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd" Dec 02 16:07:36 crc kubenswrapper[4933]: E1202 16:07:36.688950 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd\": container with ID starting with d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd not found: ID does not exist" containerID="d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.688973 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd"} err="failed to get container status \"d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd\": rpc error: code = NotFound desc = could not find container \"d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd\": container with ID starting with d617e3eac8caa07c6694fced28c1075049c243e44d29278ec26931307deed2fd not found: ID does not exist" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.688987 4933 scope.go:117] "RemoveContainer" containerID="9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8" Dec 02 16:07:36 crc kubenswrapper[4933]: E1202 16:07:36.689228 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8\": container with ID starting with 9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8 not found: ID does not exist" containerID="9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8" Dec 02 16:07:36 crc kubenswrapper[4933]: I1202 16:07:36.689255 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8"} err="failed to get container status \"9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8\": rpc error: code = NotFound desc = could not find container \"9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8\": container with ID starting with 9e11cba0fd0b7eecceeb4752648aa0c176bafcdc2f645a1def6ead0e347020a8 not found: ID does not exist" Dec 02 16:07:37 crc kubenswrapper[4933]: I1202 16:07:37.060616 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" path="/var/lib/kubelet/pods/ab2703eb-5f67-4e57-abb7-f14e41b1daf4/volumes" Dec 02 16:07:47 crc kubenswrapper[4933]: I1202 16:07:47.170120 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:07:47 crc kubenswrapper[4933]: I1202 16:07:47.171185 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506133 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk"] Dec 02 16:08:00 crc kubenswrapper[4933]: E1202 16:08:00.506887 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="registry-server" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506900 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="registry-server" Dec 02 16:08:00 crc kubenswrapper[4933]: E1202 16:08:00.506917 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="extract-utilities" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506924 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="extract-utilities" Dec 02 16:08:00 crc kubenswrapper[4933]: E1202 16:08:00.506937 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="extract-content" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506943 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="extract-content" Dec 02 16:08:00 crc kubenswrapper[4933]: E1202 16:08:00.506956 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="registry-server" Dec 02 
16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506962 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="registry-server" Dec 02 16:08:00 crc kubenswrapper[4933]: E1202 16:08:00.506970 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="extract-content" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506976 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="extract-content" Dec 02 16:08:00 crc kubenswrapper[4933]: E1202 16:08:00.506984 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="extract-utilities" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.506991 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="extract-utilities" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.507174 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab2703eb-5f67-4e57-abb7-f14e41b1daf4" containerName="registry-server" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.507189 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc26a40-ae58-4c28-ade2-7ea4e2bb9900" containerName="registry-server" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.508299 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.510168 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.515157 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk"] Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.534499 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqxq\" (UniqueName: \"kubernetes.io/projected/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-kube-api-access-lzqxq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.534585 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.534625 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.636446 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lzqxq\" (UniqueName: \"kubernetes.io/projected/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-kube-api-access-lzqxq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.636559 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.636605 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.637298 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.637378 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.655314 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqxq\" (UniqueName: \"kubernetes.io/projected/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-kube-api-access-lzqxq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:00 crc kubenswrapper[4933]: I1202 16:08:00.836129 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:01 crc kubenswrapper[4933]: I1202 16:08:01.164800 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk"] Dec 02 16:08:01 crc kubenswrapper[4933]: I1202 16:08:01.832586 4933 generic.go:334] "Generic (PLEG): container finished" podID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerID="05b0a9642602bccc52f69458803e5d8e47415a9f02ed0a226d8c1b8e7f4fcb39" exitCode=0 Dec 02 16:08:01 crc kubenswrapper[4933]: I1202 16:08:01.832810 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" event={"ID":"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb","Type":"ContainerDied","Data":"05b0a9642602bccc52f69458803e5d8e47415a9f02ed0a226d8c1b8e7f4fcb39"} Dec 02 16:08:01 crc kubenswrapper[4933]: I1202 16:08:01.832848 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" event={"ID":"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb","Type":"ContainerStarted","Data":"49d03095da379466936d4ab9469f18367d9ea822d93d4d1aa6cf3953ad7b2093"} Dec 02 16:08:03 crc kubenswrapper[4933]: I1202 16:08:03.849707 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" event={"ID":"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb","Type":"ContainerStarted","Data":"53ef4f2e1c675d2972142edadc78ab8008a68ffc1e6c99f85e16c5583805644c"} Dec 02 16:08:04 crc kubenswrapper[4933]: I1202 16:08:04.857403 4933 generic.go:334] "Generic (PLEG): container finished" podID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerID="53ef4f2e1c675d2972142edadc78ab8008a68ffc1e6c99f85e16c5583805644c" exitCode=0 Dec 02 16:08:04 crc kubenswrapper[4933]: I1202 16:08:04.857746 4933 generic.go:334] "Generic (PLEG): container finished" podID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerID="f98bb25e47f4f3bd475d8126d0d758553287eebd775d3e5e5dacaabc2b80cc69" exitCode=0 Dec 02 16:08:04 crc kubenswrapper[4933]: I1202 16:08:04.857567 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" event={"ID":"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb","Type":"ContainerDied","Data":"53ef4f2e1c675d2972142edadc78ab8008a68ffc1e6c99f85e16c5583805644c"} Dec 02 16:08:04 crc kubenswrapper[4933]: I1202 16:08:04.857783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" event={"ID":"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb","Type":"ContainerDied","Data":"f98bb25e47f4f3bd475d8126d0d758553287eebd775d3e5e5dacaabc2b80cc69"} Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.190604 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.222011 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzqxq\" (UniqueName: \"kubernetes.io/projected/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-kube-api-access-lzqxq\") pod \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.222087 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-util\") pod \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.222232 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-bundle\") pod \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\" (UID: \"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb\") " Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.223312 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-bundle" (OuterVolumeSpecName: "bundle") pod "9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" (UID: "9bc38c6b-b881-4fec-9d9f-d7bee85e17eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.229163 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-kube-api-access-lzqxq" (OuterVolumeSpecName: "kube-api-access-lzqxq") pod "9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" (UID: "9bc38c6b-b881-4fec-9d9f-d7bee85e17eb"). InnerVolumeSpecName "kube-api-access-lzqxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.239442 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-util" (OuterVolumeSpecName: "util") pod "9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" (UID: "9bc38c6b-b881-4fec-9d9f-d7bee85e17eb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.324406 4933 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.324448 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzqxq\" (UniqueName: \"kubernetes.io/projected/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-kube-api-access-lzqxq\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.324460 4933 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc38c6b-b881-4fec-9d9f-d7bee85e17eb-util\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.875796 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" event={"ID":"9bc38c6b-b881-4fec-9d9f-d7bee85e17eb","Type":"ContainerDied","Data":"49d03095da379466936d4ab9469f18367d9ea822d93d4d1aa6cf3953ad7b2093"} Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.876137 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d03095da379466936d4ab9469f18367d9ea822d93d4d1aa6cf3953ad7b2093" Dec 02 16:08:06 crc kubenswrapper[4933]: I1202 16:08:06.876031 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.887431 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq"] Dec 02 16:08:09 crc kubenswrapper[4933]: E1202 16:08:09.888400 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="pull" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.888417 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="pull" Dec 02 16:08:09 crc kubenswrapper[4933]: E1202 16:08:09.888435 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="extract" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.888442 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="extract" Dec 02 16:08:09 crc kubenswrapper[4933]: E1202 16:08:09.888505 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="util" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.888513 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="util" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.888681 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc38c6b-b881-4fec-9d9f-d7bee85e17eb" containerName="extract" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.889365 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.894107 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-w8fnm" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.895326 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.895668 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.910169 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq"] Dec 02 16:08:09 crc kubenswrapper[4933]: I1202 16:08:09.981371 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnt99\" (UniqueName: \"kubernetes.io/projected/37530e12-c39d-43b8-887f-64e9584c1d48-kube-api-access-qnt99\") pod \"nmstate-operator-5b5b58f5c8-mqmjq\" (UID: \"37530e12-c39d-43b8-887f-64e9584c1d48\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" Dec 02 16:08:10 crc kubenswrapper[4933]: I1202 16:08:10.083974 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnt99\" (UniqueName: \"kubernetes.io/projected/37530e12-c39d-43b8-887f-64e9584c1d48-kube-api-access-qnt99\") pod \"nmstate-operator-5b5b58f5c8-mqmjq\" (UID: \"37530e12-c39d-43b8-887f-64e9584c1d48\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" Dec 02 16:08:10 crc kubenswrapper[4933]: I1202 16:08:10.107302 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnt99\" (UniqueName: \"kubernetes.io/projected/37530e12-c39d-43b8-887f-64e9584c1d48-kube-api-access-qnt99\") pod \"nmstate-operator-5b5b58f5c8-mqmjq\" (UID: \"37530e12-c39d-43b8-887f-64e9584c1d48\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" Dec 02 16:08:10 crc kubenswrapper[4933]: I1202 16:08:10.218630 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" Dec 02 16:08:10 crc kubenswrapper[4933]: I1202 16:08:10.536627 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq"] Dec 02 16:08:10 crc kubenswrapper[4933]: I1202 16:08:10.910555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" event={"ID":"37530e12-c39d-43b8-887f-64e9584c1d48","Type":"ContainerStarted","Data":"6c066c1c47b9d4d117eb62e221efa8f1f7d81fe71aa436c8bd6e947ccab70e09"} Dec 02 16:08:13 crc kubenswrapper[4933]: I1202 16:08:13.934253 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" event={"ID":"37530e12-c39d-43b8-887f-64e9584c1d48","Type":"ContainerStarted","Data":"359a33542c6695801bb246e46d4d8ac8a468fc5f9f47b501dcee64d0251dc5e9"} Dec 02 16:08:13 crc kubenswrapper[4933]: I1202 16:08:13.951630 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mqmjq" podStartSLOduration=2.522125071 podStartE2EDuration="4.951613256s" podCreationTimestamp="2025-12-02 16:08:09 +0000 UTC" firstStartedPulling="2025-12-02 16:08:10.547314816 +0000 UTC m=+953.798541529" lastFinishedPulling="2025-12-02 16:08:12.976803011 +0000 UTC m=+956.228029714" observedRunningTime="2025-12-02 16:08:13.950750893 +0000 UTC m=+957.201977596" watchObservedRunningTime="2025-12-02 16:08:13.951613256 +0000 UTC m=+957.202839959" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.921325 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph"] Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.922491 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.925743 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-cm9qf" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.938970 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr"] Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.940306 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.943074 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.946333 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph"] Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.965796 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6rm\" (UniqueName: \"kubernetes.io/projected/09f6889b-f9be-48f6-a955-c19824c76cdd-kube-api-access-9x6rm\") pod \"nmstate-metrics-7f946cbc9-qr4ph\" (UID: \"09f6889b-f9be-48f6-a955-c19824c76cdd\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.966135 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kdh\" (UniqueName: \"kubernetes.io/projected/f007b4c7-10dc-452b-b11c-1506dc25da9a-kube-api-access-z2kdh\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.966216 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f007b4c7-10dc-452b-b11c-1506dc25da9a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:14 crc kubenswrapper[4933]: I1202 16:08:14.972630 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr"] Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.008882 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zjk85"] Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.009906 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.071423 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f007b4c7-10dc-452b-b11c-1506dc25da9a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.071502 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-dbus-socket\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.071530 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljdw9\" (UniqueName: \"kubernetes.io/projected/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-kube-api-access-ljdw9\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.071641 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6rm\" (UniqueName: \"kubernetes.io/projected/09f6889b-f9be-48f6-a955-c19824c76cdd-kube-api-access-9x6rm\") pod \"nmstate-metrics-7f946cbc9-qr4ph\" (UID: \"09f6889b-f9be-48f6-a955-c19824c76cdd\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.071672 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-nmstate-lock\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.072084 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-ovs-socket\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.072234 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kdh\" (UniqueName: \"kubernetes.io/projected/f007b4c7-10dc-452b-b11c-1506dc25da9a-kube-api-access-z2kdh\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:15 crc kubenswrapper[4933]: E1202 16:08:15.072433 4933 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 02 16:08:15 crc kubenswrapper[4933]: E1202 16:08:15.072507 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f007b4c7-10dc-452b-b11c-1506dc25da9a-tls-key-pair podName:f007b4c7-10dc-452b-b11c-1506dc25da9a nodeName:}" failed. No retries permitted until 2025-12-02 16:08:15.572487335 +0000 UTC m=+958.823714078 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f007b4c7-10dc-452b-b11c-1506dc25da9a-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-85srr" (UID: "f007b4c7-10dc-452b-b11c-1506dc25da9a") : secret "openshift-nmstate-webhook" not found Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.108461 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm"] Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.109606 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.111700 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.111989 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.112235 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mp666" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.118312 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm"] Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.122910 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kdh\" (UniqueName: \"kubernetes.io/projected/f007b4c7-10dc-452b-b11c-1506dc25da9a-kube-api-access-z2kdh\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.122946 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6rm\" (UniqueName: \"kubernetes.io/projected/09f6889b-f9be-48f6-a955-c19824c76cdd-kube-api-access-9x6rm\") pod \"nmstate-metrics-7f946cbc9-qr4ph\" (UID: \"09f6889b-f9be-48f6-a955-c19824c76cdd\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.184623 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw455\" (UniqueName: \"kubernetes.io/projected/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-kube-api-access-cw455\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.184953 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-dbus-socket\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.184988 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljdw9\" (UniqueName: \"kubernetes.io/projected/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-kube-api-access-ljdw9\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.185045 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
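The MountVolume.SetUp failure above is self-healing: the secret openshift-nmstate-webhook simply does not exist yet, the kubelet schedules a retry after 500ms (durationBeforeRetry), and the mount succeeds at 16:08:15.599 further down. A minimal sketch, assuming client-go and an illustrative kubeconfig path, of checking for the secret the mount is waiting on:

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace and secret name come from the log record above.
	_, err = cs.CoreV1().Secrets("openshift-nmstate").Get(context.TODO(), "openshift-nmstate-webhook", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println("secret not created yet; the kubelet will keep retrying the mount")
	case err != nil:
		panic(err)
	default:
		fmt.Println("secret present; the pending tls-key-pair mount can succeed")
	}
}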
for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.185115 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-nmstate-lock\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.185526 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-dbus-socket\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.185952 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-nmstate-lock\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.186073 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-ovs-socket\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.185152 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-ovs-socket\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.186358 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.226446 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljdw9\" (UniqueName: \"kubernetes.io/projected/6dec4a20-a5e7-4e76-80d5-8dfbb82021b3-kube-api-access-ljdw9\") pod \"nmstate-handler-zjk85\" (UID: \"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3\") " pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.244187 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.287501 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw455\" (UniqueName: \"kubernetes.io/projected/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-kube-api-access-cw455\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.287623 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.287665 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.288511 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.289003 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84778dd5c6-qprl6"] Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.291314 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.306084 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.322971 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84778dd5c6-qprl6"] Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.330081 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw455\" (UniqueName: \"kubernetes.io/projected/a4ec34d6-4de1-4ecb-a60a-46d969b32ff4-kube-api-access-cw455\") pod \"nmstate-console-plugin-7fbb5f6569-gl4gm\" (UID: \"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.340334 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389387 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-oauth-serving-cert\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389435 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-oauth-config\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389512 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-serving-cert\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-trusted-ca-bundle\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389568 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-service-ca\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389604 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wng4j\" (UniqueName: \"kubernetes.io/projected/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-kube-api-access-wng4j\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.389640 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-config\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.492814 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-serving-cert\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.492931 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-trusted-ca-bundle\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.493069 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-service-ca\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.493167 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wng4j\" (UniqueName: \"kubernetes.io/projected/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-kube-api-access-wng4j\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.493268 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-config\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.493332 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-oauth-serving-cert\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.493363 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-oauth-config\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.495765 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-service-ca\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.496154 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-config\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.496532 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.497110 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-trusted-ca-bundle\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.497669 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-oauth-serving-cert\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.502516 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-serving-cert\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.505520 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-oauth-config\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.519307 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wng4j\" (UniqueName: \"kubernetes.io/projected/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-kube-api-access-wng4j\") pod \"console-84778dd5c6-qprl6\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") " pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.595063 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f007b4c7-10dc-452b-b11c-1506dc25da9a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.599123 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f007b4c7-10dc-452b-b11c-1506dc25da9a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85srr\" (UID: \"f007b4c7-10dc-452b-b11c-1506dc25da9a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.655412 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.795105 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph"] Dec 02 16:08:15 crc kubenswrapper[4933]: W1202 16:08:15.796218 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f6889b_f9be_48f6_a955_c19824c76cdd.slice/crio-11f618307b56e92e6d1160b8de352a18dfdc2bd313da193dc0691ed9ab5cccff WatchSource:0}: Error finding container 11f618307b56e92e6d1160b8de352a18dfdc2bd313da193dc0691ed9ab5cccff: Status 404 returned error can't find the container with id 11f618307b56e92e6d1160b8de352a18dfdc2bd313da193dc0691ed9ab5cccff Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.861659 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.933086 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm"] Dec 02 16:08:15 crc kubenswrapper[4933]: W1202 16:08:15.938930 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ec34d6_4de1_4ecb_a60a_46d969b32ff4.slice/crio-344bfe486511950e8abca465e6e41add6c7050910e3a1b65b296bcacea0bb7a2 WatchSource:0}: Error finding container 344bfe486511950e8abca465e6e41add6c7050910e3a1b65b296bcacea0bb7a2: Status 404 returned error can't find the container with id 344bfe486511950e8abca465e6e41add6c7050910e3a1b65b296bcacea0bb7a2 Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.956136 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" event={"ID":"09f6889b-f9be-48f6-a955-c19824c76cdd","Type":"ContainerStarted","Data":"11f618307b56e92e6d1160b8de352a18dfdc2bd313da193dc0691ed9ab5cccff"} Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.958423 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" event={"ID":"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4","Type":"ContainerStarted","Data":"344bfe486511950e8abca465e6e41add6c7050910e3a1b65b296bcacea0bb7a2"} Dec 02 16:08:15 crc kubenswrapper[4933]: I1202 16:08:15.959901 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zjk85" event={"ID":"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3","Type":"ContainerStarted","Data":"ff25303106c5bc03436f564b2fca14f2a6deff7dc82cfba1b87df63b394afb64"} Dec 02 16:08:16 crc kubenswrapper[4933]: I1202 16:08:16.122801 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84778dd5c6-qprl6"] Dec 02 16:08:16 crc kubenswrapper[4933]: W1202 16:08:16.124248 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e17b7a2_545e_4757_87d6_4bb0d4f6407c.slice/crio-44475fcd1cc0e6c72863113ea85bfe218ea731f076d0db7d0b8989edbc3c29cf WatchSource:0}: Error finding container 44475fcd1cc0e6c72863113ea85bfe218ea731f076d0db7d0b8989edbc3c29cf: Status 404 returned error can't find the container with id 44475fcd1cc0e6c72863113ea85bfe218ea731f076d0db7d0b8989edbc3c29cf Dec 02 16:08:16 crc kubenswrapper[4933]: I1202 16:08:16.293269 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr"] Dec 02 16:08:16 crc kubenswrapper[4933]: W1202 16:08:16.309033 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf007b4c7_10dc_452b_b11c_1506dc25da9a.slice/crio-1f70bfc920578d29961c8fdd0587d17f6af4fb777a74c37f1c445c585c3d0386 WatchSource:0}: Error finding container 1f70bfc920578d29961c8fdd0587d17f6af4fb777a74c37f1c445c585c3d0386: Status 404 returned error can't find the container with id 1f70bfc920578d29961c8fdd0587d17f6af4fb777a74c37f1c445c585c3d0386 Dec 02 16:08:16 crc kubenswrapper[4933]: I1202 16:08:16.968017 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84778dd5c6-qprl6" event={"ID":"1e17b7a2-545e-4757-87d6-4bb0d4f6407c","Type":"ContainerStarted","Data":"f092b3ce01fd53d09b740797004140f50a3d7fc565e3d42c7805e63faf6b735f"} Dec 02 16:08:16 crc kubenswrapper[4933]: I1202 16:08:16.968056 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84778dd5c6-qprl6" event={"ID":"1e17b7a2-545e-4757-87d6-4bb0d4f6407c","Type":"ContainerStarted","Data":"44475fcd1cc0e6c72863113ea85bfe218ea731f076d0db7d0b8989edbc3c29cf"} Dec 02 16:08:16 crc kubenswrapper[4933]: I1202 16:08:16.969271 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" event={"ID":"f007b4c7-10dc-452b-b11c-1506dc25da9a","Type":"ContainerStarted","Data":"1f70bfc920578d29961c8fdd0587d17f6af4fb777a74c37f1c445c585c3d0386"} Dec 02 16:08:16 crc kubenswrapper[4933]: I1202 16:08:16.988452 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84778dd5c6-qprl6" podStartSLOduration=1.988431659 podStartE2EDuration="1.988431659s" podCreationTimestamp="2025-12-02 16:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:08:16.984288538 +0000 UTC m=+960.235515261" watchObservedRunningTime="2025-12-02 16:08:16.988431659 +0000 UTC m=+960.239658372" Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.169112 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.169184 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.169233 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.169810 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21d5a98b0f608aa28a64cf9b966f8d88f136c42a20cbdba910ec855c8416478d"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:08:17 crc 
kubenswrapper[4933]: I1202 16:08:17.169882 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://21d5a98b0f608aa28a64cf9b966f8d88f136c42a20cbdba910ec855c8416478d" gracePeriod=600 Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.979728 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="21d5a98b0f608aa28a64cf9b966f8d88f136c42a20cbdba910ec855c8416478d" exitCode=0 Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.979808 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"21d5a98b0f608aa28a64cf9b966f8d88f136c42a20cbdba910ec855c8416478d"} Dec 02 16:08:17 crc kubenswrapper[4933]: I1202 16:08:17.980261 4933 scope.go:117] "RemoveContainer" containerID="338ae7bd911845a8a5e2ec13c95bb1233e30afce64a9ca00f9d9ea1398ecf073" Dec 02 16:08:19 crc kubenswrapper[4933]: I1202 16:08:19.597137 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:08:19 crc kubenswrapper[4933]: I1202 16:08:19.995081 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zjk85" event={"ID":"6dec4a20-a5e7-4e76-80d5-8dfbb82021b3","Type":"ContainerStarted","Data":"4cc02d3f303cc8aca69fb93f040a2a7f493149b061a9fe69ec2ec21ecb6e2dac"} Dec 02 16:08:19 crc kubenswrapper[4933]: I1202 16:08:19.995503 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:19 crc kubenswrapper[4933]: I1202 16:08:19.996533 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" event={"ID":"09f6889b-f9be-48f6-a955-c19824c76cdd","Type":"ContainerStarted","Data":"3a1ce8624d7cf53d871ed8546b79e6ad7278e883b0528abe17303301e3265b34"} Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.001768 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"6a4e12f1c3641af98254a5deef90580d464dc44072cf6822887fee1b15222e36"} Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.004217 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" event={"ID":"a4ec34d6-4de1-4ecb-a60a-46d969b32ff4","Type":"ContainerStarted","Data":"3f9813bd4ab97fe988c143ac3e2fc45e1007f84a9c41dd2f78a30ff235bd8528"} Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.006528 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" event={"ID":"f007b4c7-10dc-452b-b11c-1506dc25da9a","Type":"ContainerStarted","Data":"1cebbf0d6a48aa8054ea2bea0e260428252a70807ffa90a6ab67990c14ac5a33"} Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.006649 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.012339 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zjk85" podStartSLOduration=2.047210168 
podStartE2EDuration="6.012321668s" podCreationTimestamp="2025-12-02 16:08:14 +0000 UTC" firstStartedPulling="2025-12-02 16:08:15.369136138 +0000 UTC m=+958.620362841" lastFinishedPulling="2025-12-02 16:08:19.334247638 +0000 UTC m=+962.585474341" observedRunningTime="2025-12-02 16:08:20.010770026 +0000 UTC m=+963.261996729" watchObservedRunningTime="2025-12-02 16:08:20.012321668 +0000 UTC m=+963.263548371" Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.028091 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" podStartSLOduration=2.724868987 podStartE2EDuration="6.028073487s" podCreationTimestamp="2025-12-02 16:08:14 +0000 UTC" firstStartedPulling="2025-12-02 16:08:16.311639353 +0000 UTC m=+959.562866056" lastFinishedPulling="2025-12-02 16:08:19.614843853 +0000 UTC m=+962.866070556" observedRunningTime="2025-12-02 16:08:20.025972521 +0000 UTC m=+963.277199244" watchObservedRunningTime="2025-12-02 16:08:20.028073487 +0000 UTC m=+963.279300200" Dec 02 16:08:20 crc kubenswrapper[4933]: I1202 16:08:20.056874 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gl4gm" podStartSLOduration=1.662644461 podStartE2EDuration="5.056853172s" podCreationTimestamp="2025-12-02 16:08:15 +0000 UTC" firstStartedPulling="2025-12-02 16:08:15.941707591 +0000 UTC m=+959.192934294" lastFinishedPulling="2025-12-02 16:08:19.335916302 +0000 UTC m=+962.587143005" observedRunningTime="2025-12-02 16:08:20.051662654 +0000 UTC m=+963.302889377" watchObservedRunningTime="2025-12-02 16:08:20.056853172 +0000 UTC m=+963.308079885" Dec 02 16:08:23 crc kubenswrapper[4933]: I1202 16:08:23.027029 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" event={"ID":"09f6889b-f9be-48f6-a955-c19824c76cdd","Type":"ContainerStarted","Data":"9e31feac5bb9a26c0d9eb9a9cb70bf8d9e2aceffc9ac800f3c8c746de1bf9ef7"} Dec 02 16:08:23 crc kubenswrapper[4933]: I1202 16:08:23.047880 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qr4ph" podStartSLOduration=2.586538627 podStartE2EDuration="9.047793655s" podCreationTimestamp="2025-12-02 16:08:14 +0000 UTC" firstStartedPulling="2025-12-02 16:08:15.798385618 +0000 UTC m=+959.049612321" lastFinishedPulling="2025-12-02 16:08:22.259640646 +0000 UTC m=+965.510867349" observedRunningTime="2025-12-02 16:08:23.042960396 +0000 UTC m=+966.294187109" watchObservedRunningTime="2025-12-02 16:08:23.047793655 +0000 UTC m=+966.299020398" Dec 02 16:08:25 crc kubenswrapper[4933]: I1202 16:08:25.383016 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zjk85" Dec 02 16:08:25 crc kubenswrapper[4933]: I1202 16:08:25.655970 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:25 crc kubenswrapper[4933]: I1202 16:08:25.656610 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:25 crc kubenswrapper[4933]: I1202 16:08:25.661061 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:26 crc kubenswrapper[4933]: I1202 16:08:26.047707 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-84778dd5c6-qprl6" Dec 02 16:08:26 crc kubenswrapper[4933]: I1202 16:08:26.144138 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d6c4458c4-cfwr8"] Dec 02 16:08:35 crc kubenswrapper[4933]: I1202 16:08:35.868064 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85srr" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.189514 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6d6c4458c4-cfwr8" podUID="af27e767-99b2-48f8-99d5-19772d768d0c" containerName="console" containerID="cri-o://d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5" gracePeriod=15 Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.613340 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d6c4458c4-cfwr8_af27e767-99b2-48f8-99d5-19772d768d0c/console/0.log" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.613654 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d6c4458c4-cfwr8" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.630983 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-serving-cert\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.631166 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-service-ca\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.631202 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-trusted-ca-bundle\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.631222 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m2tx\" (UniqueName: \"kubernetes.io/projected/af27e767-99b2-48f8-99d5-19772d768d0c-kube-api-access-2m2tx\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.631252 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-console-config\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.631303 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-oauth-config\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.631358 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-oauth-serving-cert\") pod \"af27e767-99b2-48f8-99d5-19772d768d0c\" (UID: \"af27e767-99b2-48f8-99d5-19772d768d0c\") " Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.632154 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.632145 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-console-config" (OuterVolumeSpecName: "console-config") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.632192 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-service-ca" (OuterVolumeSpecName: "service-ca") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.632220 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.639255 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.642053 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af27e767-99b2-48f8-99d5-19772d768d0c-kube-api-access-2m2tx" (OuterVolumeSpecName: "kube-api-access-2m2tx") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "kube-api-access-2m2tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.642192 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "af27e767-99b2-48f8-99d5-19772d768d0c" (UID: "af27e767-99b2-48f8-99d5-19772d768d0c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732646 4933 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732686 4933 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732701 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732710 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732718 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m2tx\" (UniqueName: \"kubernetes.io/projected/af27e767-99b2-48f8-99d5-19772d768d0c-kube-api-access-2m2tx\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732727 4933 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af27e767-99b2-48f8-99d5-19772d768d0c-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:51 crc kubenswrapper[4933]: I1202 16:08:51.732735 4933 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af27e767-99b2-48f8-99d5-19772d768d0c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.242622 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d6c4458c4-cfwr8_af27e767-99b2-48f8-99d5-19772d768d0c/console/0.log" Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.242958 4933 generic.go:334] "Generic (PLEG): container finished" podID="af27e767-99b2-48f8-99d5-19772d768d0c" containerID="d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5" exitCode=2 Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.242990 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d6c4458c4-cfwr8" event={"ID":"af27e767-99b2-48f8-99d5-19772d768d0c","Type":"ContainerDied","Data":"d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5"} Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.243017 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d6c4458c4-cfwr8" event={"ID":"af27e767-99b2-48f8-99d5-19772d768d0c","Type":"ContainerDied","Data":"ade3b1fe68ed83b8a549e7cef3b7d55771370257aff8f6c845ebd35d0d7cbf40"} Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.243036 4933 scope.go:117] "RemoveContainer" containerID="d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5" Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.243181 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d6c4458c4-cfwr8" Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.267214 4933 scope.go:117] "RemoveContainer" containerID="d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5" Dec 02 16:08:52 crc kubenswrapper[4933]: E1202 16:08:52.267783 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5\": container with ID starting with d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5 not found: ID does not exist" containerID="d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5" Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.267870 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5"} err="failed to get container status \"d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5\": rpc error: code = NotFound desc = could not find container \"d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5\": container with ID starting with d7fd1a1d0912d78511622833f4d3f50655404895a4988ed895227989ad9f2dd5 not found: ID does not exist" Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.273890 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d6c4458c4-cfwr8"] Dec 02 16:08:52 crc kubenswrapper[4933]: I1202 16:08:52.278020 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d6c4458c4-cfwr8"] Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.062309 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af27e767-99b2-48f8-99d5-19772d768d0c" path="/var/lib/kubelet/pods/af27e767-99b2-48f8-99d5-19772d768d0c/volumes" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.810275 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6"] Dec 02 16:08:53 crc kubenswrapper[4933]: E1202 16:08:53.810639 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af27e767-99b2-48f8-99d5-19772d768d0c" containerName="console" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.810656 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="af27e767-99b2-48f8-99d5-19772d768d0c" containerName="console" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.810814 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="af27e767-99b2-48f8-99d5-19772d768d0c" containerName="console" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.812159 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.814761 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.819428 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6"] Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.859685 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhjh\" (UniqueName: \"kubernetes.io/projected/de440c6d-8572-4245-a443-c7ac5ebe5a69-kube-api-access-xqhjh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.859744 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.859793 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.961997 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhjh\" (UniqueName: \"kubernetes.io/projected/de440c6d-8572-4245-a443-c7ac5ebe5a69-kube-api-access-xqhjh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.962129 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.962214 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.963051 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.963053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:53 crc kubenswrapper[4933]: I1202 16:08:53.981009 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhjh\" (UniqueName: \"kubernetes.io/projected/de440c6d-8572-4245-a443-c7ac5ebe5a69-kube-api-access-xqhjh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:54 crc kubenswrapper[4933]: I1202 16:08:54.127895 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:54 crc kubenswrapper[4933]: I1202 16:08:54.538601 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6"] Dec 02 16:08:55 crc kubenswrapper[4933]: I1202 16:08:55.267790 4933 generic.go:334] "Generic (PLEG): container finished" podID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerID="c290b9908bd638abfac47135bd70a2408a376b8ab9e30dfab042187a96f6c33a" exitCode=0 Dec 02 16:08:55 crc kubenswrapper[4933]: I1202 16:08:55.267880 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" event={"ID":"de440c6d-8572-4245-a443-c7ac5ebe5a69","Type":"ContainerDied","Data":"c290b9908bd638abfac47135bd70a2408a376b8ab9e30dfab042187a96f6c33a"} Dec 02 16:08:55 crc kubenswrapper[4933]: I1202 16:08:55.268194 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" event={"ID":"de440c6d-8572-4245-a443-c7ac5ebe5a69","Type":"ContainerStarted","Data":"c6519cc23b05c05029246eb695cd0de5c7db2b2f2571d6ded19ef055ce8963e6"} Dec 02 16:08:57 crc kubenswrapper[4933]: I1202 16:08:57.280991 4933 generic.go:334] "Generic (PLEG): container finished" podID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerID="c6082acab81bda0fb42bd032c7e16bd510b0b93df720e3b35871440fd4b0cc10" exitCode=0 Dec 02 16:08:57 crc kubenswrapper[4933]: I1202 16:08:57.281054 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" event={"ID":"de440c6d-8572-4245-a443-c7ac5ebe5a69","Type":"ContainerDied","Data":"c6082acab81bda0fb42bd032c7e16bd510b0b93df720e3b35871440fd4b0cc10"} Dec 02 16:08:58 crc kubenswrapper[4933]: I1202 16:08:58.292996 4933 generic.go:334] "Generic (PLEG): container finished" podID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerID="55b19911b90e555084b954e6dee750c0b2d2650fa299fedce5e6217b7fe543f4" exitCode=0 Dec 02 16:08:58 crc kubenswrapper[4933]: I1202 
16:08:58.293068 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" event={"ID":"de440c6d-8572-4245-a443-c7ac5ebe5a69","Type":"ContainerDied","Data":"55b19911b90e555084b954e6dee750c0b2d2650fa299fedce5e6217b7fe543f4"} Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.634717 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.653874 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-util\") pod \"de440c6d-8572-4245-a443-c7ac5ebe5a69\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.653990 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-bundle\") pod \"de440c6d-8572-4245-a443-c7ac5ebe5a69\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.654048 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqhjh\" (UniqueName: \"kubernetes.io/projected/de440c6d-8572-4245-a443-c7ac5ebe5a69-kube-api-access-xqhjh\") pod \"de440c6d-8572-4245-a443-c7ac5ebe5a69\" (UID: \"de440c6d-8572-4245-a443-c7ac5ebe5a69\") " Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.655460 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-bundle" (OuterVolumeSpecName: "bundle") pod "de440c6d-8572-4245-a443-c7ac5ebe5a69" (UID: "de440c6d-8572-4245-a443-c7ac5ebe5a69"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.659951 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de440c6d-8572-4245-a443-c7ac5ebe5a69-kube-api-access-xqhjh" (OuterVolumeSpecName: "kube-api-access-xqhjh") pod "de440c6d-8572-4245-a443-c7ac5ebe5a69" (UID: "de440c6d-8572-4245-a443-c7ac5ebe5a69"). InnerVolumeSpecName "kube-api-access-xqhjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.669230 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-util" (OuterVolumeSpecName: "util") pod "de440c6d-8572-4245-a443-c7ac5ebe5a69" (UID: "de440c6d-8572-4245-a443-c7ac5ebe5a69"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.756121 4933 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-util\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.756191 4933 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de440c6d-8572-4245-a443-c7ac5ebe5a69-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:08:59 crc kubenswrapper[4933]: I1202 16:08:59.756206 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqhjh\" (UniqueName: \"kubernetes.io/projected/de440c6d-8572-4245-a443-c7ac5ebe5a69-kube-api-access-xqhjh\") on node \"crc\" DevicePath \"\"" Dec 02 16:09:00 crc kubenswrapper[4933]: I1202 16:09:00.311963 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" event={"ID":"de440c6d-8572-4245-a443-c7ac5ebe5a69","Type":"ContainerDied","Data":"c6519cc23b05c05029246eb695cd0de5c7db2b2f2571d6ded19ef055ce8963e6"} Dec 02 16:09:00 crc kubenswrapper[4933]: I1202 16:09:00.312262 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6519cc23b05c05029246eb695cd0de5c7db2b2f2571d6ded19ef055ce8963e6" Dec 02 16:09:00 crc kubenswrapper[4933]: I1202 16:09:00.312084 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.648611 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl"] Dec 02 16:09:09 crc kubenswrapper[4933]: E1202 16:09:09.649534 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="pull" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.649550 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="pull" Dec 02 16:09:09 crc kubenswrapper[4933]: E1202 16:09:09.649563 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="util" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.649572 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="util" Dec 02 16:09:09 crc kubenswrapper[4933]: E1202 16:09:09.649596 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="extract" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.649604 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="extract" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.649757 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="de440c6d-8572-4245-a443-c7ac5ebe5a69" containerName="extract" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.650381 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.652539 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.652811 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.652933 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.653014 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.653397 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-52hw8" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.667595 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl"] Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.757945 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77gw\" (UniqueName: \"kubernetes.io/projected/a987e1fe-f524-4d29-98c4-cc0238d7b79f-kube-api-access-s77gw\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.758044 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a987e1fe-f524-4d29-98c4-cc0238d7b79f-apiservice-cert\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.758111 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a987e1fe-f524-4d29-98c4-cc0238d7b79f-webhook-cert\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.859803 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a987e1fe-f524-4d29-98c4-cc0238d7b79f-apiservice-cert\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.859950 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a987e1fe-f524-4d29-98c4-cc0238d7b79f-webhook-cert\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.860011 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s77gw\" (UniqueName: \"kubernetes.io/projected/a987e1fe-f524-4d29-98c4-cc0238d7b79f-kube-api-access-s77gw\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.867322 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a987e1fe-f524-4d29-98c4-cc0238d7b79f-apiservice-cert\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.869470 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a987e1fe-f524-4d29-98c4-cc0238d7b79f-webhook-cert\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.889657 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77gw\" (UniqueName: \"kubernetes.io/projected/a987e1fe-f524-4d29-98c4-cc0238d7b79f-kube-api-access-s77gw\") pod \"metallb-operator-controller-manager-777b6c8cc-hkdpl\" (UID: \"a987e1fe-f524-4d29-98c4-cc0238d7b79f\") " pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.970595 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.982528 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2"] Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.983518 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.988719 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 16:09:09 crc kubenswrapper[4933]: I1202 16:09:09.988940 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.006341 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kbll4" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.011575 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2"] Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.063699 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3493ec7a-7065-4895-944d-2aefb1419718-apiservice-cert\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.063816 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588c2\" (UniqueName: \"kubernetes.io/projected/3493ec7a-7065-4895-944d-2aefb1419718-kube-api-access-588c2\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.063959 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3493ec7a-7065-4895-944d-2aefb1419718-webhook-cert\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.165461 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588c2\" (UniqueName: \"kubernetes.io/projected/3493ec7a-7065-4895-944d-2aefb1419718-kube-api-access-588c2\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.165638 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3493ec7a-7065-4895-944d-2aefb1419718-webhook-cert\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.165697 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3493ec7a-7065-4895-944d-2aefb1419718-apiservice-cert\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.187126 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3493ec7a-7065-4895-944d-2aefb1419718-webhook-cert\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.187256 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3493ec7a-7065-4895-944d-2aefb1419718-apiservice-cert\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.189645 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588c2\" (UniqueName: \"kubernetes.io/projected/3493ec7a-7065-4895-944d-2aefb1419718-kube-api-access-588c2\") pod \"metallb-operator-webhook-server-ff64ff47b-dqgr2\" (UID: \"3493ec7a-7065-4895-944d-2aefb1419718\") " pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.348563 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.499709 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl"] Dec 02 16:09:10 crc kubenswrapper[4933]: W1202 16:09:10.778852 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3493ec7a_7065_4895_944d_2aefb1419718.slice/crio-70859d5a069129708a031315e9c864270f644daa6b0b527a33925b73b357e706 WatchSource:0}: Error finding container 70859d5a069129708a031315e9c864270f644daa6b0b527a33925b73b357e706: Status 404 returned error can't find the container with id 70859d5a069129708a031315e9c864270f644daa6b0b527a33925b73b357e706 Dec 02 16:09:10 crc kubenswrapper[4933]: I1202 16:09:10.782578 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2"] Dec 02 16:09:11 crc kubenswrapper[4933]: I1202 16:09:11.393424 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" event={"ID":"3493ec7a-7065-4895-944d-2aefb1419718","Type":"ContainerStarted","Data":"70859d5a069129708a031315e9c864270f644daa6b0b527a33925b73b357e706"} Dec 02 16:09:11 crc kubenswrapper[4933]: I1202 16:09:11.394664 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" event={"ID":"a987e1fe-f524-4d29-98c4-cc0238d7b79f","Type":"ContainerStarted","Data":"13059a77d58fccce9759a840b30327c614ecf4b956e555e3fd97311cc06c8895"} Dec 02 16:09:15 crc kubenswrapper[4933]: I1202 16:09:15.446494 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" event={"ID":"a987e1fe-f524-4d29-98c4-cc0238d7b79f","Type":"ContainerStarted","Data":"62527b48266cf24f8aa8729bf61317dd4663d90d5af323c4491f3c34d4840990"} Dec 02 16:09:15 crc kubenswrapper[4933]: I1202 16:09:15.448175 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:15 crc kubenswrapper[4933]: I1202 16:09:15.479807 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" podStartSLOduration=2.621278405 podStartE2EDuration="6.479783787s" podCreationTimestamp="2025-12-02 16:09:09 +0000 UTC" firstStartedPulling="2025-12-02 16:09:10.505479793 +0000 UTC m=+1013.756706496" lastFinishedPulling="2025-12-02 16:09:14.363985175 +0000 UTC m=+1017.615211878" observedRunningTime="2025-12-02 16:09:15.474533882 +0000 UTC m=+1018.725760585" watchObservedRunningTime="2025-12-02 16:09:15.479783787 +0000 UTC m=+1018.731010490" Dec 02 16:09:17 crc kubenswrapper[4933]: I1202 16:09:17.462733 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" event={"ID":"3493ec7a-7065-4895-944d-2aefb1419718","Type":"ContainerStarted","Data":"dc80b52c9efc50a937f51835bdbac0ce5b18748ce12e5494c94afd14e6036a03"} Dec 02 16:09:18 crc kubenswrapper[4933]: I1202 16:09:18.468637 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:30 crc kubenswrapper[4933]: I1202 16:09:30.356380 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" Dec 02 16:09:30 crc kubenswrapper[4933]: I1202 16:09:30.374345 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-ff64ff47b-dqgr2" podStartSLOduration=15.568947902 podStartE2EDuration="21.374324741s" podCreationTimestamp="2025-12-02 16:09:09 +0000 UTC" firstStartedPulling="2025-12-02 16:09:10.781599738 +0000 UTC m=+1014.032826441" lastFinishedPulling="2025-12-02 16:09:16.586976577 +0000 UTC m=+1019.838203280" observedRunningTime="2025-12-02 16:09:17.496314412 +0000 UTC m=+1020.747541115" watchObservedRunningTime="2025-12-02 16:09:30.374324741 +0000 UTC m=+1033.625551444" Dec 02 16:09:49 crc kubenswrapper[4933]: I1202 16:09:49.974561 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-777b6c8cc-hkdpl" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.720421 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2zdck"] Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.726534 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.728809 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.728866 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.731662 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-s475q" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.757806 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw"] Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.758788 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.760338 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.777020 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw"] Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.830858 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-metrics\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.830942 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3cc86070-243d-4886-9746-9fcab519fb50-frr-startup\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.831006 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-frr-conf\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.831048 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-frr-sockets\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.831073 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64h8d\" (UniqueName: \"kubernetes.io/projected/3cc86070-243d-4886-9746-9fcab519fb50-kube-api-access-64h8d\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.831148 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-reloader\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.831193 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc86070-243d-4886-9746-9fcab519fb50-metrics-certs\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.831320 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7428k"] Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.832781 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7428k" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.835575 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.835605 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.835644 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wqhft" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.835801 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.845203 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-dkmmd"] Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.846430 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.848669 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.871354 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-dkmmd"] Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932440 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-cert\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932808 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-frr-conf\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932880 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-frr-sockets\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932908 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64h8d\" (UniqueName: \"kubernetes.io/projected/3cc86070-243d-4886-9746-9fcab519fb50-kube-api-access-64h8d\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932926 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-metrics-certs\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932962 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r4spw\" (UID: \"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.932993 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933030 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-reloader\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933058 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzfp\" (UniqueName: \"kubernetes.io/projected/cf92f734-644b-4a6c-b716-bf6269ae99b5-kube-api-access-vnzfp\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933098 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj9d9\" (UniqueName: \"kubernetes.io/projected/b18bcd0e-ac22-4425-940c-29311785588f-kube-api-access-bj9d9\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933158 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc86070-243d-4886-9746-9fcab519fb50-metrics-certs\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933192 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-metrics\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933233 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf92f734-644b-4a6c-b716-bf6269ae99b5-metallb-excludel2\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933240 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-frr-conf\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933260 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkgc\" (UniqueName: \"kubernetes.io/projected/b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6-kube-api-access-vzkgc\") pod \"frr-k8s-webhook-server-7fcb986d4-r4spw\" (UID: 
\"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933291 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3cc86070-243d-4886-9746-9fcab519fb50-frr-startup\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933318 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-metrics-certs\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933366 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-frr-sockets\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933430 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-reloader\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.933593 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3cc86070-243d-4886-9746-9fcab519fb50-metrics\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.934246 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3cc86070-243d-4886-9746-9fcab519fb50-frr-startup\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.939458 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc86070-243d-4886-9746-9fcab519fb50-metrics-certs\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:50 crc kubenswrapper[4933]: I1202 16:09:50.952377 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64h8d\" (UniqueName: \"kubernetes.io/projected/3cc86070-243d-4886-9746-9fcab519fb50-kube-api-access-64h8d\") pod \"frr-k8s-2zdck\" (UID: \"3cc86070-243d-4886-9746-9fcab519fb50\") " pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034638 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkgc\" (UniqueName: \"kubernetes.io/projected/b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6-kube-api-access-vzkgc\") pod \"frr-k8s-webhook-server-7fcb986d4-r4spw\" (UID: \"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034703 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-metrics-certs\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034748 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-cert\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034820 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-metrics-certs\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034875 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r4spw\" (UID: \"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034902 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034946 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzfp\" (UniqueName: \"kubernetes.io/projected/cf92f734-644b-4a6c-b716-bf6269ae99b5-kube-api-access-vnzfp\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.034973 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj9d9\" (UniqueName: \"kubernetes.io/projected/b18bcd0e-ac22-4425-940c-29311785588f-kube-api-access-bj9d9\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.035027 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf92f734-644b-4a6c-b716-bf6269ae99b5-metallb-excludel2\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.035942 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf92f734-644b-4a6c-b716-bf6269ae99b5-metallb-excludel2\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: E1202 16:09:51.036055 4933 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 02 16:09:51 crc kubenswrapper[4933]: E1202 16:09:51.036112 4933 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-metrics-certs podName:b18bcd0e-ac22-4425-940c-29311785588f nodeName:}" failed. No retries permitted until 2025-12-02 16:09:51.536095902 +0000 UTC m=+1054.787322595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-metrics-certs") pod "controller-f8648f98b-dkmmd" (UID: "b18bcd0e-ac22-4425-940c-29311785588f") : secret "controller-certs-secret" not found Dec 02 16:09:51 crc kubenswrapper[4933]: E1202 16:09:51.037041 4933 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 16:09:51 crc kubenswrapper[4933]: E1202 16:09:51.037093 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist podName:cf92f734-644b-4a6c-b716-bf6269ae99b5 nodeName:}" failed. No retries permitted until 2025-12-02 16:09:51.537078297 +0000 UTC m=+1054.788305010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist") pod "speaker-7428k" (UID: "cf92f734-644b-4a6c-b716-bf6269ae99b5") : secret "metallb-memberlist" not found Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.039760 4933 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.041197 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r4spw\" (UID: \"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.043757 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2zdck" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.045332 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-metrics-certs\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.051511 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-cert\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.054219 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzfp\" (UniqueName: \"kubernetes.io/projected/cf92f734-644b-4a6c-b716-bf6269ae99b5-kube-api-access-vnzfp\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.054320 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj9d9\" (UniqueName: \"kubernetes.io/projected/b18bcd0e-ac22-4425-940c-29311785588f-kube-api-access-bj9d9\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.070493 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkgc\" (UniqueName: \"kubernetes.io/projected/b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6-kube-api-access-vzkgc\") pod \"frr-k8s-webhook-server-7fcb986d4-r4spw\" (UID: \"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.071207 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.544819 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw"] Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.545006 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-metrics-certs\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.545190 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:51 crc kubenswrapper[4933]: E1202 16:09:51.545585 4933 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 16:09:51 crc kubenswrapper[4933]: E1202 16:09:51.545627 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist podName:cf92f734-644b-4a6c-b716-bf6269ae99b5 nodeName:}" failed. 
No retries permitted until 2025-12-02 16:09:52.545613821 +0000 UTC m=+1055.796840524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist") pod "speaker-7428k" (UID: "cf92f734-644b-4a6c-b716-bf6269ae99b5") : secret "metallb-memberlist" not found Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.554372 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b18bcd0e-ac22-4425-940c-29311785588f-metrics-certs\") pod \"controller-f8648f98b-dkmmd\" (UID: \"b18bcd0e-ac22-4425-940c-29311785588f\") " pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.715969 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"59bbfdc930368c2d7d7383761a394afff8cd283194bb4eadbfdd5a384a5dc030"} Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.718127 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" event={"ID":"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6","Type":"ContainerStarted","Data":"81a4aa5acd36cd4297943f29ca86259ee3282fb9b0b9368010e1bbd990815d3e"} Dec 02 16:09:51 crc kubenswrapper[4933]: I1202 16:09:51.762688 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.169098 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-dkmmd"] Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.563063 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.578497 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf92f734-644b-4a6c-b716-bf6269ae99b5-memberlist\") pod \"speaker-7428k\" (UID: \"cf92f734-644b-4a6c-b716-bf6269ae99b5\") " pod="metallb-system/speaker-7428k" Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.654529 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7428k" Dec 02 16:09:52 crc kubenswrapper[4933]: W1202 16:09:52.678521 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf92f734_644b_4a6c_b716_bf6269ae99b5.slice/crio-68da0aedd4265021adf0ccc7e64589e39a38c71e07bc4c9707a392082329f1e8 WatchSource:0}: Error finding container 68da0aedd4265021adf0ccc7e64589e39a38c71e07bc4c9707a392082329f1e8: Status 404 returned error can't find the container with id 68da0aedd4265021adf0ccc7e64589e39a38c71e07bc4c9707a392082329f1e8 Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.726495 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-dkmmd" event={"ID":"b18bcd0e-ac22-4425-940c-29311785588f","Type":"ContainerStarted","Data":"fd7017b55dd4607043ae402a318b387d692d4670a5fb01c4146070ac7056a7ff"} Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.726539 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-dkmmd" event={"ID":"b18bcd0e-ac22-4425-940c-29311785588f","Type":"ContainerStarted","Data":"a310a596aceaeb1fcb3f65d20214f0213d0be5d7d4d9b1a90a65a32c969f6186"} Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.726552 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-dkmmd" event={"ID":"b18bcd0e-ac22-4425-940c-29311785588f","Type":"ContainerStarted","Data":"92bb067adeb799d072596182bdcd2351f543d4794ef71287cbc035f802a731d8"} Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.726614 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.727879 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7428k" event={"ID":"cf92f734-644b-4a6c-b716-bf6269ae99b5","Type":"ContainerStarted","Data":"68da0aedd4265021adf0ccc7e64589e39a38c71e07bc4c9707a392082329f1e8"} Dec 02 16:09:52 crc kubenswrapper[4933]: I1202 16:09:52.756648 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-dkmmd" podStartSLOduration=2.756626555 podStartE2EDuration="2.756626555s" podCreationTimestamp="2025-12-02 16:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:09:52.745613924 +0000 UTC m=+1055.996840637" watchObservedRunningTime="2025-12-02 16:09:52.756626555 +0000 UTC m=+1056.007853278" Dec 02 16:09:53 crc kubenswrapper[4933]: I1202 16:09:53.737997 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7428k" event={"ID":"cf92f734-644b-4a6c-b716-bf6269ae99b5","Type":"ContainerStarted","Data":"4710c506fe66a1687fa0b440d9d1094890a15edb5ebe535ce5dc2a9291757e2e"} Dec 02 16:09:53 crc kubenswrapper[4933]: I1202 16:09:53.738332 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7428k" event={"ID":"cf92f734-644b-4a6c-b716-bf6269ae99b5","Type":"ContainerStarted","Data":"75fd3ab8f2ba234b4edf5d6031f31520edfba7d3737248d94849254953abd587"} Dec 02 16:09:53 crc kubenswrapper[4933]: I1202 16:09:53.758951 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7428k" podStartSLOduration=3.758933447 podStartE2EDuration="3.758933447s" podCreationTimestamp="2025-12-02 16:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:09:53.756891345 +0000 UTC m=+1057.008118048" watchObservedRunningTime="2025-12-02 16:09:53.758933447 +0000 UTC m=+1057.010160150" Dec 02 16:09:54 crc kubenswrapper[4933]: I1202 16:09:54.746080 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7428k" Dec 02 16:09:59 crc kubenswrapper[4933]: I1202 16:09:59.816062 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" event={"ID":"b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6","Type":"ContainerStarted","Data":"1c237e96ba52769d50f0005c3cb61bc5f5836673dbcde6f604d443abfcff43c9"} Dec 02 16:09:59 crc kubenswrapper[4933]: I1202 16:09:59.816907 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:09:59 crc kubenswrapper[4933]: I1202 16:09:59.818541 4933 generic.go:334] "Generic (PLEG): container finished" podID="3cc86070-243d-4886-9746-9fcab519fb50" containerID="cc723885f25b0f275fec76d5fb3e77f4f51b7ba40698f425177ef29ae1f4b692" exitCode=0 Dec 02 16:09:59 crc kubenswrapper[4933]: I1202 16:09:59.818614 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerDied","Data":"cc723885f25b0f275fec76d5fb3e77f4f51b7ba40698f425177ef29ae1f4b692"} Dec 02 16:09:59 crc kubenswrapper[4933]: I1202 16:09:59.851505 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" podStartSLOduration=2.236685309 podStartE2EDuration="9.851478554s" podCreationTimestamp="2025-12-02 16:09:50 +0000 UTC" firstStartedPulling="2025-12-02 16:09:51.539875994 +0000 UTC m=+1054.791102707" lastFinishedPulling="2025-12-02 16:09:59.154669249 +0000 UTC m=+1062.405895952" observedRunningTime="2025-12-02 16:09:59.840927764 +0000 UTC m=+1063.092154477" watchObservedRunningTime="2025-12-02 16:09:59.851478554 +0000 UTC m=+1063.102705267" Dec 02 16:10:00 crc kubenswrapper[4933]: I1202 16:10:00.833309 4933 generic.go:334] "Generic (PLEG): container finished" podID="3cc86070-243d-4886-9746-9fcab519fb50" containerID="62607d80a6bd2288d3cafc4a4a4b2599ecf66bad121b0f9d0a5fc243ecbb78e8" exitCode=0 Dec 02 16:10:00 crc kubenswrapper[4933]: I1202 16:10:00.835617 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerDied","Data":"62607d80a6bd2288d3cafc4a4a4b2599ecf66bad121b0f9d0a5fc243ecbb78e8"} Dec 02 16:10:01 crc kubenswrapper[4933]: I1202 16:10:01.843683 4933 generic.go:334] "Generic (PLEG): container finished" podID="3cc86070-243d-4886-9746-9fcab519fb50" containerID="c61212461b6ed4cc8a204fc34ee733bb1a9e15a9987943c87e9b7e55844da115" exitCode=0 Dec 02 16:10:01 crc kubenswrapper[4933]: I1202 16:10:01.843752 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerDied","Data":"c61212461b6ed4cc8a204fc34ee733bb1a9e15a9987943c87e9b7e55844da115"} Dec 02 16:10:02 crc kubenswrapper[4933]: I1202 16:10:02.659951 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7428k" Dec 02 16:10:02 crc kubenswrapper[4933]: I1202 16:10:02.876423 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"fe57492c64ebead183fb80b5168ac5715ce50dfbd868fb8e7c8e61a2ea6e94a5"} Dec 02 16:10:02 crc kubenswrapper[4933]: I1202 16:10:02.876499 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"2f9b6dbd27bd1ef691ad755193865edfe74e9fe5a208046c05ec9418e54a03bb"} Dec 02 16:10:02 crc kubenswrapper[4933]: I1202 16:10:02.876514 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"462ffe9509b322016fbb88d1f8d963878c6f3071484fc298354fa420a78aa6e1"} Dec 02 16:10:02 crc kubenswrapper[4933]: I1202 16:10:02.876528 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"2a386d41a5d9dcc2c91049b5d831eea29aa0e00c95f27ecb5f6db8229b30ec6d"} Dec 02 16:10:02 crc kubenswrapper[4933]: I1202 16:10:02.876549 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"df052695c51a9946929ef76ac75ab9b53b217357d78761e92c39cf6e91442871"} Dec 02 16:10:03 crc kubenswrapper[4933]: I1202 16:10:03.892783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2zdck" event={"ID":"3cc86070-243d-4886-9746-9fcab519fb50","Type":"ContainerStarted","Data":"80fe2c285f5e309ae1ee94bf3193fa5a8d3fc34ee66e6ae8006a84bf740c3679"} Dec 02 16:10:03 crc kubenswrapper[4933]: I1202 16:10:03.893051 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2zdck" Dec 02 16:10:03 crc kubenswrapper[4933]: I1202 16:10:03.916615 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2zdck" podStartSLOduration=6.033007694 podStartE2EDuration="13.916595006s" podCreationTimestamp="2025-12-02 16:09:50 +0000 UTC" firstStartedPulling="2025-12-02 16:09:51.250545872 +0000 UTC m=+1054.501772575" lastFinishedPulling="2025-12-02 16:09:59.134133184 +0000 UTC m=+1062.385359887" observedRunningTime="2025-12-02 16:10:03.910617013 +0000 UTC m=+1067.161843746" watchObservedRunningTime="2025-12-02 16:10:03.916595006 +0000 UTC m=+1067.167821719" Dec 02 16:10:06 crc kubenswrapper[4933]: I1202 16:10:06.044608 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2zdck" Dec 02 16:10:06 crc kubenswrapper[4933]: I1202 16:10:06.081656 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2zdck" Dec 02 16:10:11 crc kubenswrapper[4933]: I1202 16:10:11.080172 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r4spw" Dec 02 16:10:11 crc kubenswrapper[4933]: I1202 16:10:11.767239 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-dkmmd" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.410568 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vv5bw"] Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.413378 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.416310 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.416681 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hm2jr" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.417916 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.419199 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vv5bw"] Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.546373 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrqv\" (UniqueName: \"kubernetes.io/projected/df3012be-3abc-4502-a517-2d685340047f-kube-api-access-nqrqv\") pod \"openstack-operator-index-vv5bw\" (UID: \"df3012be-3abc-4502-a517-2d685340047f\") " pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.648107 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrqv\" (UniqueName: \"kubernetes.io/projected/df3012be-3abc-4502-a517-2d685340047f-kube-api-access-nqrqv\") pod \"openstack-operator-index-vv5bw\" (UID: \"df3012be-3abc-4502-a517-2d685340047f\") " pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.679069 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrqv\" (UniqueName: \"kubernetes.io/projected/df3012be-3abc-4502-a517-2d685340047f-kube-api-access-nqrqv\") pod \"openstack-operator-index-vv5bw\" (UID: \"df3012be-3abc-4502-a517-2d685340047f\") " pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:16 crc kubenswrapper[4933]: I1202 16:10:16.747561 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:17 crc kubenswrapper[4933]: I1202 16:10:17.203425 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vv5bw"] Dec 02 16:10:18 crc kubenswrapper[4933]: I1202 16:10:18.032309 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vv5bw" event={"ID":"df3012be-3abc-4502-a517-2d685340047f","Type":"ContainerStarted","Data":"f5a322906255e2fd2366482ec7b65d417249983e69f9c1a4460f153423756d76"} Dec 02 16:10:20 crc kubenswrapper[4933]: I1202 16:10:20.050040 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vv5bw" event={"ID":"df3012be-3abc-4502-a517-2d685340047f","Type":"ContainerStarted","Data":"156cb1294dd81a2f7d898a58b2cad70da645559fa31cd853e1c07dffeb8e8fa2"} Dec 02 16:10:20 crc kubenswrapper[4933]: I1202 16:10:20.077146 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vv5bw" podStartSLOduration=1.7761045659999999 podStartE2EDuration="4.077128122s" podCreationTimestamp="2025-12-02 16:10:16 +0000 UTC" firstStartedPulling="2025-12-02 16:10:17.204205952 +0000 UTC m=+1080.455432655" lastFinishedPulling="2025-12-02 16:10:19.505229508 +0000 UTC m=+1082.756456211" observedRunningTime="2025-12-02 16:10:20.071520598 +0000 UTC m=+1083.322747301" watchObservedRunningTime="2025-12-02 16:10:20.077128122 +0000 UTC m=+1083.328354825" Dec 02 16:10:21 crc kubenswrapper[4933]: I1202 16:10:21.047050 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2zdck" Dec 02 16:10:26 crc kubenswrapper[4933]: I1202 16:10:26.747759 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:26 crc kubenswrapper[4933]: I1202 16:10:26.748494 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:26 crc kubenswrapper[4933]: I1202 16:10:26.785265 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:27 crc kubenswrapper[4933]: I1202 16:10:27.151706 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-vv5bw" Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.865049 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb"] Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.868250 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.870470 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jf6vn" Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.874176 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb"] Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.954471 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-util\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.954543 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf87z\" (UniqueName: \"kubernetes.io/projected/2749120d-52e8-46d6-b419-b0c2375e0355-kube-api-access-cf87z\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:32 crc kubenswrapper[4933]: I1202 16:10:32.954985 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-bundle\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.069874 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-bundle\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.069960 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-util\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.069987 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf87z\" (UniqueName: \"kubernetes.io/projected/2749120d-52e8-46d6-b419-b0c2375e0355-kube-api-access-cf87z\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.070949 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-bundle\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.071179 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-util\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.109048 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf87z\" (UniqueName: \"kubernetes.io/projected/2749120d-52e8-46d6-b419-b0c2375e0355-kube-api-access-cf87z\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.193040 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:33 crc kubenswrapper[4933]: I1202 16:10:33.669466 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb"] Dec 02 16:10:34 crc kubenswrapper[4933]: I1202 16:10:34.174434 4933 generic.go:334] "Generic (PLEG): container finished" podID="2749120d-52e8-46d6-b419-b0c2375e0355" containerID="1fc05b4507142a832404f3687c03f05f73f0bf6d7b7e65ff8a8c791bd0ac982e" exitCode=0 Dec 02 16:10:34 crc kubenswrapper[4933]: I1202 16:10:34.174555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" event={"ID":"2749120d-52e8-46d6-b419-b0c2375e0355","Type":"ContainerDied","Data":"1fc05b4507142a832404f3687c03f05f73f0bf6d7b7e65ff8a8c791bd0ac982e"} Dec 02 16:10:34 crc kubenswrapper[4933]: I1202 16:10:34.174636 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" event={"ID":"2749120d-52e8-46d6-b419-b0c2375e0355","Type":"ContainerStarted","Data":"c5cf97a5c01c591f50e553f7eb5b0b1ee1707282dc33420fae8e1e0925c7be1b"} Dec 02 16:10:35 crc kubenswrapper[4933]: I1202 16:10:35.185878 4933 generic.go:334] "Generic (PLEG): container finished" podID="2749120d-52e8-46d6-b419-b0c2375e0355" containerID="fbca88ebe696b49aef5d59db77b376f8b75f245ef1054ef5c4e4f428bb78c849" exitCode=0 Dec 02 16:10:35 crc kubenswrapper[4933]: I1202 16:10:35.185989 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" event={"ID":"2749120d-52e8-46d6-b419-b0c2375e0355","Type":"ContainerDied","Data":"fbca88ebe696b49aef5d59db77b376f8b75f245ef1054ef5c4e4f428bb78c849"} Dec 02 16:10:36 crc kubenswrapper[4933]: I1202 16:10:36.194871 4933 generic.go:334] "Generic (PLEG): container finished" podID="2749120d-52e8-46d6-b419-b0c2375e0355" containerID="7fcbe634d3a93a3382faf2fec54f2ba7225096835b54cda6cbfafa5cb34f6fcf" exitCode=0 Dec 02 16:10:36 crc kubenswrapper[4933]: I1202 16:10:36.194917 4933 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" event={"ID":"2749120d-52e8-46d6-b419-b0c2375e0355","Type":"ContainerDied","Data":"7fcbe634d3a93a3382faf2fec54f2ba7225096835b54cda6cbfafa5cb34f6fcf"} Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.621625 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.746620 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-util\") pod \"2749120d-52e8-46d6-b419-b0c2375e0355\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.746941 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-bundle\") pod \"2749120d-52e8-46d6-b419-b0c2375e0355\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.746970 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf87z\" (UniqueName: \"kubernetes.io/projected/2749120d-52e8-46d6-b419-b0c2375e0355-kube-api-access-cf87z\") pod \"2749120d-52e8-46d6-b419-b0c2375e0355\" (UID: \"2749120d-52e8-46d6-b419-b0c2375e0355\") " Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.747598 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-bundle" (OuterVolumeSpecName: "bundle") pod "2749120d-52e8-46d6-b419-b0c2375e0355" (UID: "2749120d-52e8-46d6-b419-b0c2375e0355"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.761342 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-util" (OuterVolumeSpecName: "util") pod "2749120d-52e8-46d6-b419-b0c2375e0355" (UID: "2749120d-52e8-46d6-b419-b0c2375e0355"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.765298 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2749120d-52e8-46d6-b419-b0c2375e0355-kube-api-access-cf87z" (OuterVolumeSpecName: "kube-api-access-cf87z") pod "2749120d-52e8-46d6-b419-b0c2375e0355" (UID: "2749120d-52e8-46d6-b419-b0c2375e0355"). InnerVolumeSpecName "kube-api-access-cf87z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.849467 4933 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-util\") on node \"crc\" DevicePath \"\"" Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.849496 4933 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2749120d-52e8-46d6-b419-b0c2375e0355-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:10:37 crc kubenswrapper[4933]: I1202 16:10:37.849505 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf87z\" (UniqueName: \"kubernetes.io/projected/2749120d-52e8-46d6-b419-b0c2375e0355-kube-api-access-cf87z\") on node \"crc\" DevicePath \"\"" Dec 02 16:10:38 crc kubenswrapper[4933]: I1202 16:10:38.219703 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" event={"ID":"2749120d-52e8-46d6-b419-b0c2375e0355","Type":"ContainerDied","Data":"c5cf97a5c01c591f50e553f7eb5b0b1ee1707282dc33420fae8e1e0925c7be1b"} Dec 02 16:10:38 crc kubenswrapper[4933]: I1202 16:10:38.219747 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5cf97a5c01c591f50e553f7eb5b0b1ee1707282dc33420fae8e1e0925c7be1b" Dec 02 16:10:38 crc kubenswrapper[4933]: I1202 16:10:38.219766 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb" Dec 02 16:10:47 crc kubenswrapper[4933]: I1202 16:10:47.169588 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:10:47 crc kubenswrapper[4933]: I1202 16:10:47.169952 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.401610 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp"] Dec 02 16:10:48 crc kubenswrapper[4933]: E1202 16:10:48.402077 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="util" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.402098 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="util" Dec 02 16:10:48 crc kubenswrapper[4933]: E1202 16:10:48.402136 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="extract" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.402147 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="extract" Dec 02 16:10:48 crc kubenswrapper[4933]: E1202 16:10:48.402178 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="pull" Dec 02 
16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.402188 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="pull" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.402444 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="2749120d-52e8-46d6-b419-b0c2375e0355" containerName="extract" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.403512 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.406447 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gmb6l" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.422791 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp"] Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.436641 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtmml\" (UniqueName: \"kubernetes.io/projected/a85c4e8a-605a-46c0-9b73-a9fa99a314a1-kube-api-access-qtmml\") pod \"openstack-operator-controller-operator-69c497bc86-fnlmp\" (UID: \"a85c4e8a-605a-46c0-9b73-a9fa99a314a1\") " pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.538267 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtmml\" (UniqueName: \"kubernetes.io/projected/a85c4e8a-605a-46c0-9b73-a9fa99a314a1-kube-api-access-qtmml\") pod \"openstack-operator-controller-operator-69c497bc86-fnlmp\" (UID: \"a85c4e8a-605a-46c0-9b73-a9fa99a314a1\") " pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.558228 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtmml\" (UniqueName: \"kubernetes.io/projected/a85c4e8a-605a-46c0-9b73-a9fa99a314a1-kube-api-access-qtmml\") pod \"openstack-operator-controller-operator-69c497bc86-fnlmp\" (UID: \"a85c4e8a-605a-46c0-9b73-a9fa99a314a1\") " pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:10:48 crc kubenswrapper[4933]: I1202 16:10:48.725438 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:10:49 crc kubenswrapper[4933]: I1202 16:10:49.228815 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp"] Dec 02 16:10:49 crc kubenswrapper[4933]: I1202 16:10:49.318337 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" event={"ID":"a85c4e8a-605a-46c0-9b73-a9fa99a314a1","Type":"ContainerStarted","Data":"0e2d9045916b9b9ec17042a2cc5817918c6dfc73d0b63ff0c78bb850f969acef"} Dec 02 16:10:53 crc kubenswrapper[4933]: I1202 16:10:53.347942 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" event={"ID":"a85c4e8a-605a-46c0-9b73-a9fa99a314a1","Type":"ContainerStarted","Data":"38ec5f617b23ba61743835b1509e6e2a2a1eb6545fbcecfe2a5403378e8beb27"} Dec 02 16:10:53 crc kubenswrapper[4933]: I1202 16:10:53.349323 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:10:53 crc kubenswrapper[4933]: I1202 16:10:53.382551 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" podStartSLOduration=1.892496578 podStartE2EDuration="5.382532853s" podCreationTimestamp="2025-12-02 16:10:48 +0000 UTC" firstStartedPulling="2025-12-02 16:10:49.238006427 +0000 UTC m=+1112.489233130" lastFinishedPulling="2025-12-02 16:10:52.728042702 +0000 UTC m=+1115.979269405" observedRunningTime="2025-12-02 16:10:53.381038144 +0000 UTC m=+1116.632264897" watchObservedRunningTime="2025-12-02 16:10:53.382532853 +0000 UTC m=+1116.633759556" Dec 02 16:10:58 crc kubenswrapper[4933]: I1202 16:10:58.728632 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-69c497bc86-fnlmp" Dec 02 16:11:17 crc kubenswrapper[4933]: I1202 16:11:17.170028 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:11:17 crc kubenswrapper[4933]: I1202 16:11:17.172262 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.009590 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.011342 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.014007 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wsc9\" (UniqueName: \"kubernetes.io/projected/d7ebb5f9-1140-47dc-a2e2-1952a295d218-kube-api-access-4wsc9\") pod \"barbican-operator-controller-manager-7d9dfd778-ltqgf\" (UID: \"d7ebb5f9-1140-47dc-a2e2-1952a295d218\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.015090 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.016542 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.021251 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qs72r" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.021279 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4sjcd" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.032459 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.040612 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.081996 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.106110 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.107965 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.110178 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.114538 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-lctjn" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.114543 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-q5k24" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.115708 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wsc9\" (UniqueName: \"kubernetes.io/projected/d7ebb5f9-1140-47dc-a2e2-1952a295d218-kube-api-access-4wsc9\") pod \"barbican-operator-controller-manager-7d9dfd778-ltqgf\" (UID: \"d7ebb5f9-1140-47dc-a2e2-1952a295d218\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.143755 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.146738 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wsc9\" (UniqueName: \"kubernetes.io/projected/d7ebb5f9-1140-47dc-a2e2-1952a295d218-kube-api-access-4wsc9\") pod \"barbican-operator-controller-manager-7d9dfd778-ltqgf\" (UID: \"d7ebb5f9-1140-47dc-a2e2-1952a295d218\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.182017 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.238736 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsdmr\" (UniqueName: \"kubernetes.io/projected/4a6c2fb7-89ad-402e-8e31-b56e50d1386c-kube-api-access-dsdmr\") pod \"cinder-operator-controller-manager-859b6ccc6-zhsjb\" (UID: \"4a6c2fb7-89ad-402e-8e31-b56e50d1386c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.239151 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9xxx\" (UniqueName: \"kubernetes.io/projected/623d7dfe-ddc2-4557-95a3-02b8fb56ee35-kube-api-access-v9xxx\") pod \"designate-operator-controller-manager-78b4bc895b-vgblp\" (UID: \"623d7dfe-ddc2-4557-95a3-02b8fb56ee35\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.239305 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqx7h\" (UniqueName: \"kubernetes.io/projected/443e1418-ad3d-4f7a-b7b0-682beceb2977-kube-api-access-mqx7h\") pod \"glance-operator-controller-manager-77987cd8cd-9kzqg\" (UID: \"443e1418-ad3d-4f7a-b7b0-682beceb2977\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.242770 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.249073 4933 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.261760 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-k7ckm" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.313255 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-ggssz"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.318281 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.320956 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-h2zrp" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.321293 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.345922 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.346059 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9xxx\" (UniqueName: \"kubernetes.io/projected/623d7dfe-ddc2-4557-95a3-02b8fb56ee35-kube-api-access-v9xxx\") pod \"designate-operator-controller-manager-78b4bc895b-vgblp\" (UID: \"623d7dfe-ddc2-4557-95a3-02b8fb56ee35\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.346120 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqx7h\" (UniqueName: \"kubernetes.io/projected/443e1418-ad3d-4f7a-b7b0-682beceb2977-kube-api-access-mqx7h\") pod \"glance-operator-controller-manager-77987cd8cd-9kzqg\" (UID: \"443e1418-ad3d-4f7a-b7b0-682beceb2977\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.346167 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsdmr\" (UniqueName: \"kubernetes.io/projected/4a6c2fb7-89ad-402e-8e31-b56e50d1386c-kube-api-access-dsdmr\") pod \"cinder-operator-controller-manager-859b6ccc6-zhsjb\" (UID: \"4a6c2fb7-89ad-402e-8e31-b56e50d1386c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.347555 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.356730 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-g8z8w" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.376131 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsdmr\" (UniqueName: \"kubernetes.io/projected/4a6c2fb7-89ad-402e-8e31-b56e50d1386c-kube-api-access-dsdmr\") pod \"cinder-operator-controller-manager-859b6ccc6-zhsjb\" (UID: \"4a6c2fb7-89ad-402e-8e31-b56e50d1386c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.381783 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.382780 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9xxx\" (UniqueName: \"kubernetes.io/projected/623d7dfe-ddc2-4557-95a3-02b8fb56ee35-kube-api-access-v9xxx\") pod \"designate-operator-controller-manager-78b4bc895b-vgblp\" (UID: \"623d7dfe-ddc2-4557-95a3-02b8fb56ee35\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.386695 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.398404 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.402556 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqx7h\" (UniqueName: \"kubernetes.io/projected/443e1418-ad3d-4f7a-b7b0-682beceb2977-kube-api-access-mqx7h\") pod \"glance-operator-controller-manager-77987cd8cd-9kzqg\" (UID: \"443e1418-ad3d-4f7a-b7b0-682beceb2977\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.411723 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.412900 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-ggssz"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.425932 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.428941 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.431271 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cv2t8" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.446198 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.447283 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgs6v\" (UniqueName: \"kubernetes.io/projected/e7437781-26c4-4a57-8afc-6bbc1ef7f7dd-kube-api-access-qgs6v\") pod \"horizon-operator-controller-manager-68c6d99b8f-t7rsk\" (UID: \"e7437781-26c4-4a57-8afc-6bbc1ef7f7dd\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.448124 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.448241 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f628\" (UniqueName: \"kubernetes.io/projected/2e0aae7d-292b-4117-8b13-6021d1b5174a-kube-api-access-8f628\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.469474 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.470882 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.478319 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-58wwr" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.481696 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.483152 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.485267 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nzb7n" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.515489 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.536208 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.540524 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.550691 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgs6v\" (UniqueName: \"kubernetes.io/projected/e7437781-26c4-4a57-8afc-6bbc1ef7f7dd-kube-api-access-qgs6v\") pod \"horizon-operator-controller-manager-68c6d99b8f-t7rsk\" (UID: \"e7437781-26c4-4a57-8afc-6bbc1ef7f7dd\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.550791 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.550854 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789gw\" (UniqueName: \"kubernetes.io/projected/fba1c43c-fdfb-4ea0-939b-e38adcc79720-kube-api-access-789gw\") pod \"heat-operator-controller-manager-5f64f6f8bb-rzj5x\" (UID: \"fba1c43c-fdfb-4ea0-939b-e38adcc79720\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.550897 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f628\" (UniqueName: \"kubernetes.io/projected/2e0aae7d-292b-4117-8b13-6021d1b5174a-kube-api-access-8f628\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.550960 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxgqm\" (UniqueName: \"kubernetes.io/projected/9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa-kube-api-access-dxgqm\") pod \"keystone-operator-controller-manager-7765d96ddf-mslvn\" (UID: \"9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:11:27 crc kubenswrapper[4933]: E1202 16:11:27.551181 4933 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:27 crc kubenswrapper[4933]: E1202 16:11:27.551264 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert podName:2e0aae7d-292b-4117-8b13-6021d1b5174a nodeName:}" failed. No retries permitted until 2025-12-02 16:11:28.051245643 +0000 UTC m=+1151.302472346 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert") pod "infra-operator-controller-manager-57548d458d-ggssz" (UID: "2e0aae7d-292b-4117-8b13-6021d1b5174a") : secret "infra-operator-webhook-server-cert" not found
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.576529 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgs6v\" (UniqueName: \"kubernetes.io/projected/e7437781-26c4-4a57-8afc-6bbc1ef7f7dd-kube-api-access-qgs6v\") pod \"horizon-operator-controller-manager-68c6d99b8f-t7rsk\" (UID: \"e7437781-26c4-4a57-8afc-6bbc1ef7f7dd\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk"
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.593681 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88"]
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.595372 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88"
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.598884 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp"
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.617437 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f628\" (UniqueName: \"kubernetes.io/projected/2e0aae7d-292b-4117-8b13-6021d1b5174a-kube-api-access-8f628\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz"
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.623703 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ft7tj"
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.636468 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88"]
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.661012 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6sdj\" (UniqueName: \"kubernetes.io/projected/e80caebf-2148-4666-98bb-963fce1bc84e-kube-api-access-v6sdj\") pod \"ironic-operator-controller-manager-6c548fd776-ln9b5\" (UID: \"e80caebf-2148-4666-98bb-963fce1bc84e\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5"
Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.661129 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wh5\" (UniqueName: \"kubernetes.io/projected/a5d0bd48-d38b-44ae-b486-1ff751a0791a-kube-api-access-p4wh5\") pod \"manila-operator-controller-manager-7c79b5df47-rmd89\" (UID: \"a5d0bd48-d38b-44ae-b486-1ff751a0791a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89"
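The "durationBeforeRetry 500ms" above is the first step of the kubelet volume manager's doubling backoff: the same "cert" volume for infra-operator-controller-manager-57548d458d-ggssz comes back later in this log at 1s and then 2s while the "infra-operator-webhook-server-cert" secret stays missing. A minimal sketch of that doubling-with-cap retry behavior, assuming illustrative names and a 2-minute cap rather than the kubelet's actual nestedpendingoperations code:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// mountBackoff models the doubling retry delay visible in this log
// (500ms, then 1s, then 2s) up to a cap. The type, the cap, and the
// fake setUp below are illustrative, not kubelet internals.
type mountBackoff struct {
	delay    time.Duration
	maxDelay time.Duration
}

func (b *mountBackoff) next() time.Duration {
	d := b.delay
	b.delay *= 2
	if b.delay > b.maxDelay {
		b.delay = b.maxDelay
	}
	return d
}

var errSecretNotFound = errors.New(`secret "infra-operator-webhook-server-cert" not found`)

// setUp stands in for MountVolume.SetUp: it keeps failing until the
// secret appears (here, hard-coded to succeed on the fourth attempt).
func setUp(attempt int) error {
	if attempt < 3 {
		return errSecretNotFound
	}
	return nil
}

func main() {
	b := &mountBackoff{delay: 500 * time.Millisecond, maxDelay: 2 * time.Minute}
	for attempt := 0; ; attempt++ {
		err := setUp(attempt)
		if err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		}
		d := b.next()
		fmt.Printf("MountVolume.SetUp failed: %v; no retries permitted for %s\n", err, d)
		time.Sleep(d)
	}
}
```

The pod stays in ContainerCreating the whole time; nothing restarts, the mount is simply re-attempted on the schedule above.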
\"fba1c43c-fdfb-4ea0-939b-e38adcc79720\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.661243 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxgqm\" (UniqueName: \"kubernetes.io/projected/9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa-kube-api-access-dxgqm\") pod \"keystone-operator-controller-manager-7765d96ddf-mslvn\" (UID: \"9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.677857 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.704577 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-789gw\" (UniqueName: \"kubernetes.io/projected/fba1c43c-fdfb-4ea0-939b-e38adcc79720-kube-api-access-789gw\") pod \"heat-operator-controller-manager-5f64f6f8bb-rzj5x\" (UID: \"fba1c43c-fdfb-4ea0-939b-e38adcc79720\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.704915 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxgqm\" (UniqueName: \"kubernetes.io/projected/9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa-kube-api-access-dxgqm\") pod \"keystone-operator-controller-manager-7765d96ddf-mslvn\" (UID: \"9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.712366 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.713722 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.734116 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-87kvc" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.755915 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-87566"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.757272 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.762805 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4wh5\" (UniqueName: \"kubernetes.io/projected/a5d0bd48-d38b-44ae-b486-1ff751a0791a-kube-api-access-p4wh5\") pod \"manila-operator-controller-manager-7c79b5df47-rmd89\" (UID: \"a5d0bd48-d38b-44ae-b486-1ff751a0791a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.762935 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6sdj\" (UniqueName: \"kubernetes.io/projected/e80caebf-2148-4666-98bb-963fce1bc84e-kube-api-access-v6sdj\") pod \"ironic-operator-controller-manager-6c548fd776-ln9b5\" (UID: \"e80caebf-2148-4666-98bb-963fce1bc84e\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.762985 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsb7\" (UniqueName: \"kubernetes.io/projected/22a42f16-b74f-4323-aee5-2d713c1232ea-kube-api-access-7jsb7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7zb88\" (UID: \"22a42f16-b74f-4323-aee5-2d713c1232ea\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.778364 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cld5c" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.793979 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6sdj\" (UniqueName: \"kubernetes.io/projected/e80caebf-2148-4666-98bb-963fce1bc84e-kube-api-access-v6sdj\") pod \"ironic-operator-controller-manager-6c548fd776-ln9b5\" (UID: \"e80caebf-2148-4666-98bb-963fce1bc84e\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.808696 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4wh5\" (UniqueName: \"kubernetes.io/projected/a5d0bd48-d38b-44ae-b486-1ff751a0791a-kube-api-access-p4wh5\") pod \"manila-operator-controller-manager-7c79b5df47-rmd89\" (UID: \"a5d0bd48-d38b-44ae-b486-1ff751a0791a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.818988 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.820410 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.831263 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-k9852" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.840784 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.857560 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.865695 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5wnz\" (UniqueName: \"kubernetes.io/projected/a35dd4ba-4d05-4af0-b0b2-2285e9e35889-kube-api-access-z5wnz\") pod \"nova-operator-controller-manager-697bc559fc-87566\" (UID: \"a35dd4ba-4d05-4af0-b0b2-2285e9e35889\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.865746 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsb7\" (UniqueName: \"kubernetes.io/projected/22a42f16-b74f-4323-aee5-2d713c1232ea-kube-api-access-7jsb7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7zb88\" (UID: \"22a42f16-b74f-4323-aee5-2d713c1232ea\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.865768 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvczl\" (UniqueName: \"kubernetes.io/projected/b8e9c61c-a72f-41e1-8f62-99d951ce4950-kube-api-access-rvczl\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-n59qc\" (UID: \"b8e9c61c-a72f-41e1-8f62-99d951ce4950\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.866191 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.880574 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.918927 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-87566"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.924601 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsb7\" (UniqueName: \"kubernetes.io/projected/22a42f16-b74f-4323-aee5-2d713c1232ea-kube-api-access-7jsb7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7zb88\" (UID: \"22a42f16-b74f-4323-aee5-2d713c1232ea\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.925406 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.955903 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj"] Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.973111 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b99l\" (UniqueName: \"kubernetes.io/projected/7d831995-fd00-455a-822e-82eb0cca6a33-kube-api-access-5b99l\") pod \"octavia-operator-controller-manager-998648c74-t8bvj\" (UID: \"7d831995-fd00-455a-822e-82eb0cca6a33\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.973317 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5wnz\" (UniqueName: \"kubernetes.io/projected/a35dd4ba-4d05-4af0-b0b2-2285e9e35889-kube-api-access-z5wnz\") pod \"nova-operator-controller-manager-697bc559fc-87566\" (UID: \"a35dd4ba-4d05-4af0-b0b2-2285e9e35889\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.973352 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvczl\" (UniqueName: \"kubernetes.io/projected/b8e9c61c-a72f-41e1-8f62-99d951ce4950-kube-api-access-rvczl\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-n59qc\" (UID: \"b8e9c61c-a72f-41e1-8f62-99d951ce4950\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:11:27 crc kubenswrapper[4933]: I1202 16:11:27.979231 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.017567 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvczl\" (UniqueName: \"kubernetes.io/projected/b8e9c61c-a72f-41e1-8f62-99d951ce4950-kube-api-access-rvczl\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-n59qc\" (UID: \"b8e9c61c-a72f-41e1-8f62-99d951ce4950\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.024848 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5wnz\" (UniqueName: \"kubernetes.io/projected/a35dd4ba-4d05-4af0-b0b2-2285e9e35889-kube-api-access-z5wnz\") pod \"nova-operator-controller-manager-697bc559fc-87566\" (UID: \"a35dd4ba-4d05-4af0-b0b2-2285e9e35889\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.063903 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.066237 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.066293 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.082088 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9mws7" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.084764 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b99l\" (UniqueName: \"kubernetes.io/projected/7d831995-fd00-455a-822e-82eb0cca6a33-kube-api-access-5b99l\") pod \"octavia-operator-controller-manager-998648c74-t8bvj\" (UID: \"7d831995-fd00-455a-822e-82eb0cca6a33\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.085150 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.085858 4933 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.085907 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert podName:2e0aae7d-292b-4117-8b13-6021d1b5174a nodeName:}" failed. No retries permitted until 2025-12-02 16:11:29.085893159 +0000 UTC m=+1152.337119862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert") pod "infra-operator-controller-manager-57548d458d-ggssz" (UID: "2e0aae7d-292b-4117-8b13-6021d1b5174a") : secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.092397 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.094374 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.106206 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.106533 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.106260 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xc92m" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.119193 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b99l\" (UniqueName: \"kubernetes.io/projected/7d831995-fd00-455a-822e-82eb0cca6a33-kube-api-access-5b99l\") pod \"octavia-operator-controller-manager-998648c74-t8bvj\" (UID: \"7d831995-fd00-455a-822e-82eb0cca6a33\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.145955 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.171870 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.177482 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.180463 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.181211 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xd87z" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.187966 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.188744 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stjb\" (UniqueName: \"kubernetes.io/projected/9e637867-b8e7-48b3-987d-53172ec80734-kube-api-access-8stjb\") pod \"ovn-operator-controller-manager-b6456fdb6-mgdvd\" (UID: \"9e637867-b8e7-48b3-987d-53172ec80734\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.188798 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpx5c\" (UniqueName: \"kubernetes.io/projected/a9221a82-9f86-4d33-a63b-71bd4c532830-kube-api-access-dpx5c\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.220003 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.230284 4933 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.233126 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.239291 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8hmzq" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.243890 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.247104 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.251144 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jql46" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.258894 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.284585 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.299771 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.299908 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxbf8\" (UniqueName: \"kubernetes.io/projected/e19f7d2e-55da-4ba5-9a68-0d49c06eecf3-kube-api-access-rxbf8\") pod \"placement-operator-controller-manager-78f8948974-dbnk8\" (UID: \"e19f7d2e-55da-4ba5-9a68-0d49c06eecf3\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.299959 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stjb\" (UniqueName: \"kubernetes.io/projected/9e637867-b8e7-48b3-987d-53172ec80734-kube-api-access-8stjb\") pod \"ovn-operator-controller-manager-b6456fdb6-mgdvd\" (UID: \"9e637867-b8e7-48b3-987d-53172ec80734\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.300031 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpx5c\" (UniqueName: \"kubernetes.io/projected/a9221a82-9f86-4d33-a63b-71bd4c532830-kube-api-access-dpx5c\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.300900 4933 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.300950 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert podName:a9221a82-9f86-4d33-a63b-71bd4c532830 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:28.800936562 +0000 UTC m=+1152.052163265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" (UID: "a9221a82-9f86-4d33-a63b-71bd4c532830") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.310114 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.316457 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.319425 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.321458 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-tfsn4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.321709 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpx5c\" (UniqueName: \"kubernetes.io/projected/a9221a82-9f86-4d33-a63b-71bd4c532830-kube-api-access-dpx5c\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.331123 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stjb\" (UniqueName: \"kubernetes.io/projected/9e637867-b8e7-48b3-987d-53172ec80734-kube-api-access-8stjb\") pod \"ovn-operator-controller-manager-b6456fdb6-mgdvd\" (UID: \"9e637867-b8e7-48b3-987d-53172ec80734\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.347450 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.370621 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.372071 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.373979 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2fsv2" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.386946 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.402172 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqnh\" (UniqueName: \"kubernetes.io/projected/94050b59-4392-4a9d-9ce8-b5e2c61e0d46-kube-api-access-4pqnh\") pod \"telemetry-operator-controller-manager-65697495f7-vqpwl\" (UID: \"94050b59-4392-4a9d-9ce8-b5e2c61e0d46\") " pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.402491 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxbf8\" (UniqueName: \"kubernetes.io/projected/e19f7d2e-55da-4ba5-9a68-0d49c06eecf3-kube-api-access-rxbf8\") pod \"placement-operator-controller-manager-78f8948974-dbnk8\" (UID: \"e19f7d2e-55da-4ba5-9a68-0d49c06eecf3\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.402536 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62cd\" (UniqueName: \"kubernetes.io/projected/d5d121c4-7a04-478d-b210-36b258949699-kube-api-access-q62cd\") pod \"swift-operator-controller-manager-5f8c65bbfc-qgclr\" (UID: \"d5d121c4-7a04-478d-b210-36b258949699\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.407158 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.418932 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxbf8\" (UniqueName: \"kubernetes.io/projected/e19f7d2e-55da-4ba5-9a68-0d49c06eecf3-kube-api-access-rxbf8\") pod \"placement-operator-controller-manager-78f8948974-dbnk8\" (UID: \"e19f7d2e-55da-4ba5-9a68-0d49c06eecf3\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.435639 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.437582 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.442057 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.442299 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hznw5" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.444044 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.457154 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.482072 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.483163 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.490251 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ws999" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.493721 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.504811 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q62cd\" (UniqueName: \"kubernetes.io/projected/d5d121c4-7a04-478d-b210-36b258949699-kube-api-access-q62cd\") pod \"swift-operator-controller-manager-5f8c65bbfc-qgclr\" (UID: \"d5d121c4-7a04-478d-b210-36b258949699\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.504976 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqnh\" (UniqueName: \"kubernetes.io/projected/94050b59-4392-4a9d-9ce8-b5e2c61e0d46-kube-api-access-4pqnh\") pod \"telemetry-operator-controller-manager-65697495f7-vqpwl\" (UID: \"94050b59-4392-4a9d-9ce8-b5e2c61e0d46\") " pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.505037 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wqp\" (UniqueName: \"kubernetes.io/projected/7ab353e5-677f-499c-8909-47813767a26c-kube-api-access-z6wqp\") pod \"watcher-operator-controller-manager-769dc69bc-c2pbl\" (UID: \"7ab353e5-677f-499c-8909-47813767a26c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.505084 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzb2\" (UniqueName: \"kubernetes.io/projected/7f82c652-2e90-4fd6-bd23-381c2f529a27-kube-api-access-pzzb2\") pod \"test-operator-controller-manager-5854674fcc-w2ddp\" (UID: \"7f82c652-2e90-4fd6-bd23-381c2f529a27\") " 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.525016 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqnh\" (UniqueName: \"kubernetes.io/projected/94050b59-4392-4a9d-9ce8-b5e2c61e0d46-kube-api-access-4pqnh\") pod \"telemetry-operator-controller-manager-65697495f7-vqpwl\" (UID: \"94050b59-4392-4a9d-9ce8-b5e2c61e0d46\") " pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.527854 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62cd\" (UniqueName: \"kubernetes.io/projected/d5d121c4-7a04-478d-b210-36b258949699-kube-api-access-q62cd\") pod \"swift-operator-controller-manager-5f8c65bbfc-qgclr\" (UID: \"d5d121c4-7a04-478d-b210-36b258949699\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.606840 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.606907 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wqp\" (UniqueName: \"kubernetes.io/projected/7ab353e5-677f-499c-8909-47813767a26c-kube-api-access-z6wqp\") pod \"watcher-operator-controller-manager-769dc69bc-c2pbl\" (UID: \"7ab353e5-677f-499c-8909-47813767a26c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.606942 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.606966 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzb2\" (UniqueName: \"kubernetes.io/projected/7f82c652-2e90-4fd6-bd23-381c2f529a27-kube-api-access-pzzb2\") pod \"test-operator-controller-manager-5854674fcc-w2ddp\" (UID: \"7f82c652-2e90-4fd6-bd23-381c2f529a27\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.607038 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5mww\" (UniqueName: \"kubernetes.io/projected/a23e8ea6-05d4-49b6-a661-30b1236cb653-kube-api-access-r5mww\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.607061 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fxh6\" (UniqueName: 
\"kubernetes.io/projected/fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67-kube-api-access-9fxh6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-f8sqn\" (UID: \"fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.617599 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.631359 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wqp\" (UniqueName: \"kubernetes.io/projected/7ab353e5-677f-499c-8909-47813767a26c-kube-api-access-z6wqp\") pod \"watcher-operator-controller-manager-769dc69bc-c2pbl\" (UID: \"7ab353e5-677f-499c-8909-47813767a26c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.641096 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzb2\" (UniqueName: \"kubernetes.io/projected/7f82c652-2e90-4fd6-bd23-381c2f529a27-kube-api-access-pzzb2\") pod \"test-operator-controller-manager-5854674fcc-w2ddp\" (UID: \"7f82c652-2e90-4fd6-bd23-381c2f529a27\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.669767 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg"] Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.671393 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.710005 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.710086 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.710169 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5mww\" (UniqueName: \"kubernetes.io/projected/a23e8ea6-05d4-49b6-a661-30b1236cb653-kube-api-access-r5mww\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.710199 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fxh6\" (UniqueName: \"kubernetes.io/projected/fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67-kube-api-access-9fxh6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-f8sqn\" (UID: \"fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.710326 4933 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.710410 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:29.210386512 +0000 UTC m=+1152.461613275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.710567 4933 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.710592 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:29.210584977 +0000 UTC m=+1152.461811670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "metrics-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.733538 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5mww\" (UniqueName: \"kubernetes.io/projected/a23e8ea6-05d4-49b6-a661-30b1236cb653-kube-api-access-r5mww\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.735577 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fxh6\" (UniqueName: \"kubernetes.io/projected/fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67-kube-api-access-9fxh6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-f8sqn\" (UID: \"fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.738120 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.755416 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.757101 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.772161 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.804694 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" Dec 02 16:11:28 crc kubenswrapper[4933]: I1202 16:11:28.811332 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.812010 4933 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:28 crc kubenswrapper[4933]: E1202 16:11:28.812057 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert podName:a9221a82-9f86-4d33-a63b-71bd4c532830 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:29.812043673 +0000 UTC m=+1153.063270376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" (UID: "a9221a82-9f86-4d33-a63b-71bd4c532830") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.031491 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb"] Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.080627 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp"] Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.085037 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623d7dfe_ddc2_4557_95a3_02b8fb56ee35.slice/crio-5bab0171483807f31546ff83c18e768d8b20ae82adca751b84292cb36ac89d4b WatchSource:0}: Error finding container 5bab0171483807f31546ff83c18e768d8b20ae82adca751b84292cb36ac89d4b: Status 404 returned error can't find the container with id 5bab0171483807f31546ff83c18e768d8b20ae82adca751b84292cb36ac89d4b Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.121342 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.121596 4933 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.121644 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert podName:2e0aae7d-292b-4117-8b13-6021d1b5174a nodeName:}" failed. 
No retries permitted until 2025-12-02 16:11:31.12163118 +0000 UTC m=+1154.372857883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert") pod "infra-operator-controller-manager-57548d458d-ggssz" (UID: "2e0aae7d-292b-4117-8b13-6021d1b5174a") : secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.222751 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.223066 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.222917 4933 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.224160 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:30.224143403 +0000 UTC m=+1153.475370106 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.224156 4933 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.224262 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:30.224235956 +0000 UTC m=+1153.475462699 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "metrics-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.430125 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d831995_fd00_455a_822e_82eb0cca6a33.slice/crio-14d8b90a023bf052e09cde56944722069a79d295b8da4c390da123654640a71c WatchSource:0}: Error finding container 14d8b90a023bf052e09cde56944722069a79d295b8da4c390da123654640a71c: Status 404 returned error can't find the container with id 14d8b90a023bf052e09cde56944722069a79d295b8da4c390da123654640a71c Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.434703 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj"] Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.443941 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c02a5b3_2770_427a_a0a4_c1fac9bd0ffa.slice/crio-89e7bbdd337ff991a7a62ca85036716783f587968082d2d86899260509d64f9f WatchSource:0}: Error finding container 89e7bbdd337ff991a7a62ca85036716783f587968082d2d86899260509d64f9f: Status 404 returned error can't find the container with id 89e7bbdd337ff991a7a62ca85036716783f587968082d2d86899260509d64f9f Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.448341 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn"] Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.462481 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba1c43c_fdfb_4ea0_939b_e38adcc79720.slice/crio-c894f3ee575365c10ae94b1c4539db7a24763a7d86f3e824a7f59326173d4471 WatchSource:0}: Error finding container c894f3ee575365c10ae94b1c4539db7a24763a7d86f3e824a7f59326173d4471: Status 404 returned error can't find the container with id c894f3ee575365c10ae94b1c4539db7a24763a7d86f3e824a7f59326173d4471 Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.463786 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk"] Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.466014 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7437781_26c4_4a57_8afc_6bbc1ef7f7dd.slice/crio-0846fcf883741eea0824daf4b68b99e4c469750f462567af39310f5384f2ca36 WatchSource:0}: Error finding container 0846fcf883741eea0824daf4b68b99e4c469750f462567af39310f5384f2ca36: Status 404 returned error can't find the container with id 0846fcf883741eea0824daf4b68b99e4c469750f462567af39310f5384f2ca36 Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.476213 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x"] Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.487483 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5"] Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.682414 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" event={"ID":"7d831995-fd00-455a-822e-82eb0cca6a33","Type":"ContainerStarted","Data":"14d8b90a023bf052e09cde56944722069a79d295b8da4c390da123654640a71c"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.683811 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" event={"ID":"9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa","Type":"ContainerStarted","Data":"89e7bbdd337ff991a7a62ca85036716783f587968082d2d86899260509d64f9f"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.685915 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" event={"ID":"e80caebf-2148-4666-98bb-963fce1bc84e","Type":"ContainerStarted","Data":"4b00b803d1518519d870dcc0b69a91cc22256133e543f9c2b088149aa5c08b97"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.687302 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" event={"ID":"443e1418-ad3d-4f7a-b7b0-682beceb2977","Type":"ContainerStarted","Data":"247905404afdea9ee877a14b3dc19ea9304ab1b6972a1526aca95ae60a439b9d"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.688418 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" event={"ID":"4a6c2fb7-89ad-402e-8e31-b56e50d1386c","Type":"ContainerStarted","Data":"28624d115f0c0ff321472310bb8d4bf352c713135bd7a8aa540ba1a543181950"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.689572 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" event={"ID":"d7ebb5f9-1140-47dc-a2e2-1952a295d218","Type":"ContainerStarted","Data":"21a548d3c563be3ae1cd4d52089071ffab7ed3776f16941be43cf47960df2327"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.690680 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" event={"ID":"fba1c43c-fdfb-4ea0-939b-e38adcc79720","Type":"ContainerStarted","Data":"c894f3ee575365c10ae94b1c4539db7a24763a7d86f3e824a7f59326173d4471"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.692356 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" event={"ID":"623d7dfe-ddc2-4557-95a3-02b8fb56ee35","Type":"ContainerStarted","Data":"5bab0171483807f31546ff83c18e768d8b20ae82adca751b84292cb36ac89d4b"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.693162 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" event={"ID":"e7437781-26c4-4a57-8afc-6bbc1ef7f7dd","Type":"ContainerStarted","Data":"0846fcf883741eea0824daf4b68b99e4c469750f462567af39310f5384f2ca36"} Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.836497 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.836837 4933 secret.go:188] 
Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: E1202 16:11:29.836909 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert podName:a9221a82-9f86-4d33-a63b-71bd4c532830 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:31.836890085 +0000 UTC m=+1155.088116788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" (UID: "a9221a82-9f86-4d33-a63b-71bd4c532830") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.873082 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-87566"] Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.894883 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8"] Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.920620 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89"] Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.922013 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5d0bd48_d38b_44ae_b486_1ff751a0791a.slice/crio-644e282732053aad57e8b577027c0abb888cfab9691decd30d025e51dc134042 WatchSource:0}: Error finding container 644e282732053aad57e8b577027c0abb888cfab9691decd30d025e51dc134042: Status 404 returned error can't find the container with id 644e282732053aad57e8b577027c0abb888cfab9691decd30d025e51dc134042 Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.922839 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19f7d2e_55da_4ba5_9a68_0d49c06eecf3.slice/crio-51cfdd36b54da2ae7acbb150047185bccb66e30c81a7ea1bc7ac88d19157a02f WatchSource:0}: Error finding container 51cfdd36b54da2ae7acbb150047185bccb66e30c81a7ea1bc7ac88d19157a02f: Status 404 returned error can't find the container with id 51cfdd36b54da2ae7acbb150047185bccb66e30c81a7ea1bc7ac88d19157a02f Dec 02 16:11:29 crc kubenswrapper[4933]: W1202 16:11:29.926287 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a42f16_b74f_4323_aee5_2d713c1232ea.slice/crio-2956749fb1a23b200451f0f5a78d9729256742e488bc0697d6ccdfdd3a0d9e46 WatchSource:0}: Error finding container 2956749fb1a23b200451f0f5a78d9729256742e488bc0697d6ccdfdd3a0d9e46: Status 404 returned error can't find the container with id 2956749fb1a23b200451f0f5a78d9729256742e488bc0697d6ccdfdd3a0d9e46 Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.928684 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd"] Dec 02 16:11:29 crc kubenswrapper[4933]: I1202 16:11:29.948569 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88"] Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.051841 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc"] Dec 02 16:11:30 crc kubenswrapper[4933]: W1202 16:11:30.053424 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8e9c61c_a72f_41e1_8f62_99d951ce4950.slice/crio-5d5c34eac797de7118fc4a134e0675142c68d44a32d93ca47cc03828b54f8dad WatchSource:0}: Error finding container 5d5c34eac797de7118fc4a134e0675142c68d44a32d93ca47cc03828b54f8dad: Status 404 returned error can't find the container with id 5d5c34eac797de7118fc4a134e0675142c68d44a32d93ca47cc03828b54f8dad Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.245135 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.245334 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.245347 4933 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.245443 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:32.24542131 +0000 UTC m=+1155.496648083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "metrics-server-cert" not found Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.245478 4933 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.245536 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:32.245517723 +0000 UTC m=+1155.496744436 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "webhook-server-cert" not found Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.295494 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl"] Dec 02 16:11:30 crc kubenswrapper[4933]: W1202 16:11:30.308913 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94050b59_4392_4a9d_9ce8_b5e2c61e0d46.slice/crio-c3315610749daeb73c78e56d3e6c61284c85cbb339fa71f749a4341a8750d99e WatchSource:0}: Error finding container c3315610749daeb73c78e56d3e6c61284c85cbb339fa71f749a4341a8750d99e: Status 404 returned error can't find the container with id c3315610749daeb73c78e56d3e6c61284c85cbb339fa71f749a4341a8750d99e Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.327379 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr"] Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.342013 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn"] Dec 02 16:11:30 crc kubenswrapper[4933]: W1202 16:11:30.349119 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc13fe0e_6c9a_4d9a_bf78_ec06c2962f67.slice/crio-c5e17bebc3c113f37ae560165b0e2abba01c6b162f97a13d837d4d045df88866 WatchSource:0}: Error finding container c5e17bebc3c113f37ae560165b0e2abba01c6b162f97a13d837d4d045df88866: Status 404 returned error can't find the container with id c5e17bebc3c113f37ae560165b0e2abba01c6b162f97a13d837d4d045df88866 Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.355611 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp"] Dec 02 16:11:30 crc kubenswrapper[4933]: W1202 16:11:30.370293 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d121c4_7a04_478d_b210_36b258949699.slice/crio-2d9f97d2199ff8cb2b5d11f29cecb8fdb074923d16e2b976abb536970ea92b04 WatchSource:0}: Error finding container 2d9f97d2199ff8cb2b5d11f29cecb8fdb074923d16e2b976abb536970ea92b04: Status 404 returned error can't find the container with id 2d9f97d2199ff8cb2b5d11f29cecb8fdb074923d16e2b976abb536970ea92b04 Dec 02 16:11:30 crc kubenswrapper[4933]: W1202 16:11:30.371875 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f82c652_2e90_4fd6_bd23_381c2f529a27.slice/crio-36aaf71029c6fba54cdb237e98d82eba63a08759c28d4363b253fcb8a4e8745b WatchSource:0}: Error finding container 36aaf71029c6fba54cdb237e98d82eba63a08759c28d4363b253fcb8a4e8745b: Status 404 returned error can't find the container with id 36aaf71029c6fba54cdb237e98d82eba63a08759c28d4363b253fcb8a4e8745b Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.375950 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pzzb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-w2ddp_openstack-operators(7f82c652-2e90-4fd6-bd23-381c2f529a27): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.378055 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pzzb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-w2ddp_openstack-operators(7f82c652-2e90-4fd6-bd23-381c2f529a27): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.379243 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" podUID="7f82c652-2e90-4fd6-bd23-381c2f529a27" Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.382313 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl"] Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.402177 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6wqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-c2pbl_openstack-operators(7ab353e5-677f-499c-8909-47813767a26c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.404203 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6wqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-c2pbl_openstack-operators(7ab353e5-677f-499c-8909-47813767a26c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.405480 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" podUID="7ab353e5-677f-499c-8909-47813767a26c" Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.710561 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" event={"ID":"d5d121c4-7a04-478d-b210-36b258949699","Type":"ContainerStarted","Data":"2d9f97d2199ff8cb2b5d11f29cecb8fdb074923d16e2b976abb536970ea92b04"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.712550 4933 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" event={"ID":"7ab353e5-677f-499c-8909-47813767a26c","Type":"ContainerStarted","Data":"a5ae15d1455500ef87440721b5267d606f6893b778dc68706599f7085a944b92"} Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.715292 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" podUID="7ab353e5-677f-499c-8909-47813767a26c" Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.715777 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" event={"ID":"9e637867-b8e7-48b3-987d-53172ec80734","Type":"ContainerStarted","Data":"d7aa05d674b8562b195a6c446dbbe75f38e58c6dba037e0818455b413fe3ab6d"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.723708 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" event={"ID":"22a42f16-b74f-4323-aee5-2d713c1232ea","Type":"ContainerStarted","Data":"2956749fb1a23b200451f0f5a78d9729256742e488bc0697d6ccdfdd3a0d9e46"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.726390 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" event={"ID":"fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67","Type":"ContainerStarted","Data":"c5e17bebc3c113f37ae560165b0e2abba01c6b162f97a13d837d4d045df88866"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.734769 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" event={"ID":"a5d0bd48-d38b-44ae-b486-1ff751a0791a","Type":"ContainerStarted","Data":"644e282732053aad57e8b577027c0abb888cfab9691decd30d025e51dc134042"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.748382 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" event={"ID":"b8e9c61c-a72f-41e1-8f62-99d951ce4950","Type":"ContainerStarted","Data":"5d5c34eac797de7118fc4a134e0675142c68d44a32d93ca47cc03828b54f8dad"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.754784 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" event={"ID":"7f82c652-2e90-4fd6-bd23-381c2f529a27","Type":"ContainerStarted","Data":"36aaf71029c6fba54cdb237e98d82eba63a08759c28d4363b253fcb8a4e8745b"} Dec 02 16:11:30 crc kubenswrapper[4933]: E1202 16:11:30.757673 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" podUID="7f82c652-2e90-4fd6-bd23-381c2f529a27" Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.758108 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" event={"ID":"a35dd4ba-4d05-4af0-b0b2-2285e9e35889","Type":"ContainerStarted","Data":"07b08e5ff0d9a8fd68ba2bacddf71cd696a60449701fd3679c7be153fcce8c73"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.764165 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" event={"ID":"e19f7d2e-55da-4ba5-9a68-0d49c06eecf3","Type":"ContainerStarted","Data":"51cfdd36b54da2ae7acbb150047185bccb66e30c81a7ea1bc7ac88d19157a02f"} Dec 02 16:11:30 crc kubenswrapper[4933]: I1202 16:11:30.767430 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" event={"ID":"94050b59-4392-4a9d-9ce8-b5e2c61e0d46","Type":"ContainerStarted","Data":"c3315610749daeb73c78e56d3e6c61284c85cbb339fa71f749a4341a8750d99e"} Dec 02 16:11:31 crc kubenswrapper[4933]: I1202 16:11:31.171786 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:31 crc kubenswrapper[4933]: E1202 16:11:31.172012 4933 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:31 crc kubenswrapper[4933]: E1202 16:11:31.172095 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert podName:2e0aae7d-292b-4117-8b13-6021d1b5174a nodeName:}" failed. No retries permitted until 2025-12-02 16:11:35.172076013 +0000 UTC m=+1158.423302716 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert") pod "infra-operator-controller-manager-57548d458d-ggssz" (UID: "2e0aae7d-292b-4117-8b13-6021d1b5174a") : secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:31 crc kubenswrapper[4933]: E1202 16:11:31.788715 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" podUID="7f82c652-2e90-4fd6-bd23-381c2f529a27" Dec 02 16:11:31 crc kubenswrapper[4933]: E1202 16:11:31.789338 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" podUID="7ab353e5-677f-499c-8909-47813767a26c" Dec 02 16:11:31 crc kubenswrapper[4933]: I1202 16:11:31.896085 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:31 crc kubenswrapper[4933]: E1202 16:11:31.896383 4933 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:31 crc kubenswrapper[4933]: E1202 16:11:31.896436 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert podName:a9221a82-9f86-4d33-a63b-71bd4c532830 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:35.896420149 +0000 UTC m=+1159.147646852 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" (UID: "a9221a82-9f86-4d33-a63b-71bd4c532830") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:32 crc kubenswrapper[4933]: I1202 16:11:32.304081 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:32 crc kubenswrapper[4933]: E1202 16:11:32.304297 4933 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 16:11:32 crc kubenswrapper[4933]: I1202 16:11:32.304583 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:32 crc kubenswrapper[4933]: E1202 16:11:32.304629 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:36.304603715 +0000 UTC m=+1159.555830418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "webhook-server-cert" not found Dec 02 16:11:32 crc kubenswrapper[4933]: E1202 16:11:32.304863 4933 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 16:11:32 crc kubenswrapper[4933]: E1202 16:11:32.305006 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:36.304967135 +0000 UTC m=+1159.556193828 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "metrics-server-cert" not found Dec 02 16:11:35 crc kubenswrapper[4933]: I1202 16:11:35.263255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:35 crc kubenswrapper[4933]: E1202 16:11:35.263549 4933 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:35 crc kubenswrapper[4933]: E1202 16:11:35.263694 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert podName:2e0aae7d-292b-4117-8b13-6021d1b5174a nodeName:}" failed. No retries permitted until 2025-12-02 16:11:43.263650662 +0000 UTC m=+1166.514877365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert") pod "infra-operator-controller-manager-57548d458d-ggssz" (UID: "2e0aae7d-292b-4117-8b13-6021d1b5174a") : secret "infra-operator-webhook-server-cert" not found Dec 02 16:11:35 crc kubenswrapper[4933]: I1202 16:11:35.976009 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:35 crc kubenswrapper[4933]: E1202 16:11:35.976194 4933 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:35 crc kubenswrapper[4933]: E1202 16:11:35.976270 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert podName:a9221a82-9f86-4d33-a63b-71bd4c532830 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:43.976252906 +0000 UTC m=+1167.227479609 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" (UID: "a9221a82-9f86-4d33-a63b-71bd4c532830") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 16:11:36 crc kubenswrapper[4933]: I1202 16:11:36.318062 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:36 crc kubenswrapper[4933]: I1202 16:11:36.318150 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:36 crc kubenswrapper[4933]: E1202 16:11:36.318262 4933 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 16:11:36 crc kubenswrapper[4933]: E1202 16:11:36.318317 4933 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 16:11:36 crc kubenswrapper[4933]: E1202 16:11:36.318324 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:44.318308665 +0000 UTC m=+1167.569535358 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "webhook-server-cert" not found Dec 02 16:11:36 crc kubenswrapper[4933]: E1202 16:11:36.318379 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:11:44.318365897 +0000 UTC m=+1167.569592600 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "metrics-server-cert" not found Dec 02 16:11:42 crc kubenswrapper[4933]: E1202 16:11:42.525175 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 02 16:11:42 crc kubenswrapper[4933]: E1202 16:11:42.525719 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgs6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-t7rsk_openstack-operators(e7437781-26c4-4a57-8afc-6bbc1ef7f7dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:11:43 crc kubenswrapper[4933]: I1202 16:11:43.322060 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:43 crc kubenswrapper[4933]: I1202 16:11:43.336308 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e0aae7d-292b-4117-8b13-6021d1b5174a-cert\") pod \"infra-operator-controller-manager-57548d458d-ggssz\" (UID: \"2e0aae7d-292b-4117-8b13-6021d1b5174a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:43 crc kubenswrapper[4933]: I1202 16:11:43.572051 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:11:44 crc kubenswrapper[4933]: I1202 16:11:44.036071 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:44 crc kubenswrapper[4933]: I1202 16:11:44.056411 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9221a82-9f86-4d33-a63b-71bd4c532830-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8\" (UID: \"a9221a82-9f86-4d33-a63b-71bd4c532830\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:44 crc kubenswrapper[4933]: I1202 16:11:44.224257 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:11:44 crc kubenswrapper[4933]: I1202 16:11:44.341919 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:44 crc kubenswrapper[4933]: I1202 16:11:44.342004 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:44 crc kubenswrapper[4933]: E1202 16:11:44.342140 4933 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 16:11:44 crc kubenswrapper[4933]: E1202 16:11:44.342225 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs podName:a23e8ea6-05d4-49b6-a661-30b1236cb653 nodeName:}" failed. No retries permitted until 2025-12-02 16:12:00.34220392 +0000 UTC m=+1183.593430673 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs") pod "openstack-operator-controller-manager-596767c485-f2zz4" (UID: "a23e8ea6-05d4-49b6-a661-30b1236cb653") : secret "webhook-server-cert" not found Dec 02 16:11:44 crc kubenswrapper[4933]: I1202 16:11:44.349704 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-metrics-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:11:47 crc kubenswrapper[4933]: I1202 16:11:47.169220 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:11:47 crc kubenswrapper[4933]: I1202 16:11:47.169944 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:11:47 crc kubenswrapper[4933]: I1202 16:11:47.170013 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:11:47 crc kubenswrapper[4933]: I1202 16:11:47.171190 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a4e12f1c3641af98254a5deef90580d464dc44072cf6822887fee1b15222e36"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:11:47 crc kubenswrapper[4933]: I1202 16:11:47.171270 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://6a4e12f1c3641af98254a5deef90580d464dc44072cf6822887fee1b15222e36" gracePeriod=600 Dec 02 16:11:49 crc kubenswrapper[4933]: I1202 16:11:49.026016 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="6a4e12f1c3641af98254a5deef90580d464dc44072cf6822887fee1b15222e36" exitCode=0 Dec 02 16:11:49 crc kubenswrapper[4933]: I1202 16:11:49.026096 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"6a4e12f1c3641af98254a5deef90580d464dc44072cf6822887fee1b15222e36"} Dec 02 16:11:49 crc kubenswrapper[4933]: I1202 16:11:49.026410 4933 scope.go:117] "RemoveContainer" containerID="21d5a98b0f608aa28a64cf9b966f8d88f136c42a20cbdba910ec855c8416478d" Dec 02 16:11:50 crc kubenswrapper[4933]: E1202 16:11:50.847075 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 02 16:11:50 crc kubenswrapper[4933]: E1202 16:11:50.847611 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rxbf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-dbnk8_openstack-operators(e19f7d2e-55da-4ba5-9a68-0d49c06eecf3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:11:51 crc kubenswrapper[4933]: E1202 16:11:51.542520 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 02 16:11:51 crc kubenswrapper[4933]: E1202 16:11:51.542721 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rvczl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-n59qc_openstack-operators(b8e9c61c-a72f-41e1-8f62-99d951ce4950): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:11:52 crc kubenswrapper[4933]: E1202 16:11:52.686930 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 02 16:11:52 crc kubenswrapper[4933]: E1202 16:11:52.687159 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q62cd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-qgclr_openstack-operators(d5d121c4-7a04-478d-b210-36b258949699): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:11:53 crc kubenswrapper[4933]: E1202 16:11:53.283380 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 02 16:11:53 crc kubenswrapper[4933]: E1202 16:11:53.283897 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8stjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-mgdvd_openstack-operators(9e637867-b8e7-48b3-987d-53172ec80734): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:11:53 crc kubenswrapper[4933]: E1202 16:11:53.823157 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 02 16:11:53 crc kubenswrapper[4933]: E1202 16:11:53.823367 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jsb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-7zb88_openstack-operators(22a42f16-b74f-4323-aee5-2d713c1232ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:11:56 crc kubenswrapper[4933]: E1202 16:11:56.822425 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 02 16:11:56 crc kubenswrapper[4933]: E1202 16:11:56.822969 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p4wh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-rmd89_openstack-operators(a5d0bd48-d38b-44ae-b486-1ff751a0791a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:12:00 crc kubenswrapper[4933]: I1202 16:12:00.344425 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:12:00 crc kubenswrapper[4933]: I1202 16:12:00.349838 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a23e8ea6-05d4-49b6-a661-30b1236cb653-webhook-certs\") pod \"openstack-operator-controller-manager-596767c485-f2zz4\" (UID: \"a23e8ea6-05d4-49b6-a661-30b1236cb653\") " pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:12:00 crc kubenswrapper[4933]: I1202 16:12:00.586959 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:12:02 crc kubenswrapper[4933]: E1202 16:12:02.867514 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.44:5001/openstack-k8s-operators/telemetry-operator:f44d427de628dd83cacb613b38866e8c2725b35f" Dec 02 16:12:02 crc kubenswrapper[4933]: E1202 16:12:02.868059 4933 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.44:5001/openstack-k8s-operators/telemetry-operator:f44d427de628dd83cacb613b38866e8c2725b35f" Dec 02 16:12:02 crc kubenswrapper[4933]: E1202 16:12:02.868197 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.44:5001/openstack-k8s-operators/telemetry-operator:f44d427de628dd83cacb613b38866e8c2725b35f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4pqnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-65697495f7-vqpwl_openstack-operators(94050b59-4392-4a9d-9ce8-b5e2c61e0d46): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:12:03 crc kubenswrapper[4933]: E1202 16:12:03.497020 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 02 16:12:03 crc kubenswrapper[4933]: E1202 16:12:03.497214 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pzzb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-w2ddp_openstack-operators(7f82c652-2e90-4fd6-bd23-381c2f529a27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:12:03 crc kubenswrapper[4933]: E1202 16:12:03.938921 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 02 16:12:03 crc kubenswrapper[4933]: E1202 16:12:03.939184 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9fxh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-f8sqn_openstack-operators(fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:12:03 crc kubenswrapper[4933]: E1202 16:12:03.940348 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" podUID="fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67" Dec 02 16:12:04 crc kubenswrapper[4933]: E1202 16:12:04.175305 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" podUID="fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67" Dec 02 16:12:04 crc kubenswrapper[4933]: E1202 16:12:04.683574 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 02 16:12:04 crc kubenswrapper[4933]: E1202 16:12:04.683816 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z5wnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-87566_openstack-operators(a35dd4ba-4d05-4af0-b0b2-2285e9e35889): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:12:05 crc kubenswrapper[4933]: E1202 16:12:05.186763 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 02 16:12:05 crc kubenswrapper[4933]: E1202 16:12:05.188136 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dxgqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-mslvn_openstack-operators(9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:12:05 crc kubenswrapper[4933]: I1202 16:12:05.791474 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-ggssz"] Dec 02 16:12:05 crc kubenswrapper[4933]: I1202 16:12:05.883180 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8"] Dec 02 16:12:05 crc kubenswrapper[4933]: W1202 16:12:05.930676 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0aae7d_292b_4117_8b13_6021d1b5174a.slice/crio-9e20aeade9585cf3567061d13f02f13982e89f87cf69b0b67dc0a5f309c4e8c1 WatchSource:0}: Error finding container 9e20aeade9585cf3567061d13f02f13982e89f87cf69b0b67dc0a5f309c4e8c1: Status 404 returned error can't find the container with id 9e20aeade9585cf3567061d13f02f13982e89f87cf69b0b67dc0a5f309c4e8c1 Dec 02 16:12:05 crc kubenswrapper[4933]: W1202 16:12:05.953360 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9221a82_9f86_4d33_a63b_71bd4c532830.slice/crio-535c4534a1bd8fc57a58307ddd721eb5ee9fcaeec3ed63692049e5341026d6b6 WatchSource:0}: Error finding container 535c4534a1bd8fc57a58307ddd721eb5ee9fcaeec3ed63692049e5341026d6b6: Status 404 returned error can't find the container with id 535c4534a1bd8fc57a58307ddd721eb5ee9fcaeec3ed63692049e5341026d6b6 Dec 02 16:12:06 crc kubenswrapper[4933]: I1202 16:12:06.005896 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4"] Dec 02 16:12:06 crc kubenswrapper[4933]: 
I1202 16:12:06.228956 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" event={"ID":"e80caebf-2148-4666-98bb-963fce1bc84e","Type":"ContainerStarted","Data":"5145313fa869b0c91290c83266919eb89f21576b72756d2ed558f784a52c4b74"} Dec 02 16:12:06 crc kubenswrapper[4933]: I1202 16:12:06.235899 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" event={"ID":"2e0aae7d-292b-4117-8b13-6021d1b5174a","Type":"ContainerStarted","Data":"9e20aeade9585cf3567061d13f02f13982e89f87cf69b0b67dc0a5f309c4e8c1"} Dec 02 16:12:06 crc kubenswrapper[4933]: I1202 16:12:06.239540 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" event={"ID":"443e1418-ad3d-4f7a-b7b0-682beceb2977","Type":"ContainerStarted","Data":"059a0d864b1907638686e281a6a8c7a3ff5ff42af2767b4314890ed2b2bd4925"} Dec 02 16:12:06 crc kubenswrapper[4933]: I1202 16:12:06.243541 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112"} Dec 02 16:12:06 crc kubenswrapper[4933]: I1202 16:12:06.249384 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" event={"ID":"a9221a82-9f86-4d33-a63b-71bd4c532830","Type":"ContainerStarted","Data":"535c4534a1bd8fc57a58307ddd721eb5ee9fcaeec3ed63692049e5341026d6b6"} Dec 02 16:12:07 crc kubenswrapper[4933]: I1202 16:12:07.259533 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" event={"ID":"a23e8ea6-05d4-49b6-a661-30b1236cb653","Type":"ContainerStarted","Data":"d06795025ce525c41c74f39ba7611e4f39a2f1c7b8c946400971827737c7e668"} Dec 02 16:12:08 crc kubenswrapper[4933]: I1202 16:12:08.268656 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" event={"ID":"7d831995-fd00-455a-822e-82eb0cca6a33","Type":"ContainerStarted","Data":"d04a9530ac18ef55d8c9321c949f1e9c0929296d1812571adf7e92bd3fa417fd"} Dec 02 16:12:08 crc kubenswrapper[4933]: I1202 16:12:08.270540 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" event={"ID":"7ab353e5-677f-499c-8909-47813767a26c","Type":"ContainerStarted","Data":"c9fbde220a501f8037a55d33cd37c17ea44dc398490b5b87c136b8ec44377c6e"} Dec 02 16:12:08 crc kubenswrapper[4933]: I1202 16:12:08.272218 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" event={"ID":"fba1c43c-fdfb-4ea0-939b-e38adcc79720","Type":"ContainerStarted","Data":"e36c9825c132b7d06c2a27792d4cac5c1b6adb5747c48b7d8c5ccfc9049cf98a"} Dec 02 16:12:08 crc kubenswrapper[4933]: I1202 16:12:08.273689 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" event={"ID":"623d7dfe-ddc2-4557-95a3-02b8fb56ee35","Type":"ContainerStarted","Data":"8ab3cf6cfafeba8a992b8ed4bc96a491a0acd80718c9acf40174d091fcfe1217"} Dec 02 16:12:08 crc kubenswrapper[4933]: I1202 16:12:08.275105 4933 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" event={"ID":"4a6c2fb7-89ad-402e-8e31-b56e50d1386c","Type":"ContainerStarted","Data":"6256566cd9993ba2645804c7fc3bdc888fe1d44ab0d028af8f639bf2ad510428"} Dec 02 16:12:08 crc kubenswrapper[4933]: I1202 16:12:08.276670 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" event={"ID":"d7ebb5f9-1140-47dc-a2e2-1952a295d218","Type":"ContainerStarted","Data":"d4f427e690749f272480d67f042d7221abab129e9a5d6fe22f62e63dce31a83c"} Dec 02 16:12:11 crc kubenswrapper[4933]: E1202 16:12:11.171767 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" podUID="9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa" Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.307173 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" event={"ID":"9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa","Type":"ContainerStarted","Data":"a3337a79813fff4dc3c356b6ae0fd566dc235e63d997a7ae998ae80e0e63986d"} Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.315471 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" event={"ID":"a23e8ea6-05d4-49b6-a661-30b1236cb653","Type":"ContainerStarted","Data":"1d4d1358ad65678249998c44ed08045bc8b982f6da126dc149cc61f084785936"} Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.316151 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.320631 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" event={"ID":"e80caebf-2148-4666-98bb-963fce1bc84e","Type":"ContainerStarted","Data":"74bc367af7c323004a2e2a1996c8ee61e192e9eb763806a26f64d5416c1ef70c"} Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.320834 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.323795 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" Dec 02 16:12:11 crc kubenswrapper[4933]: E1202 16:12:11.326646 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" podUID="9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa" Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.360041 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" podStartSLOduration=44.36002353 podStartE2EDuration="44.36002353s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:12:11.348148125 +0000 UTC m=+1194.599374828" watchObservedRunningTime="2025-12-02 16:12:11.36002353 +0000 UTC m=+1194.611250233" Dec 02 16:12:11 crc kubenswrapper[4933]: I1202 16:12:11.398299 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ln9b5" podStartSLOduration=2.9305487230000002 podStartE2EDuration="44.398273477s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.474708271 +0000 UTC m=+1152.725934974" lastFinishedPulling="2025-12-02 16:12:10.942433025 +0000 UTC m=+1194.193659728" observedRunningTime="2025-12-02 16:12:11.377504495 +0000 UTC m=+1194.628731188" watchObservedRunningTime="2025-12-02 16:12:11.398273477 +0000 UTC m=+1194.649500180" Dec 02 16:12:12 crc kubenswrapper[4933]: I1202 16:12:12.330653 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" event={"ID":"7d831995-fd00-455a-822e-82eb0cca6a33","Type":"ContainerStarted","Data":"adb67d054792926afd90b103fa81e7a12547757d6bc6cb1c70e0c5a03eb48476"} Dec 02 16:12:12 crc kubenswrapper[4933]: E1202 16:12:12.332892 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" podUID="9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa" Dec 02 16:12:12 crc kubenswrapper[4933]: E1202 16:12:12.605684 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" podUID="a35dd4ba-4d05-4af0-b0b2-2285e9e35889" Dec 02 16:12:12 crc kubenswrapper[4933]: E1202 16:12:12.610941 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" podUID="d5d121c4-7a04-478d-b210-36b258949699" Dec 02 16:12:12 crc kubenswrapper[4933]: E1202 16:12:12.768074 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" podUID="7f82c652-2e90-4fd6-bd23-381c2f529a27" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.054564 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" podUID="e19f7d2e-55da-4ba5-9a68-0d49c06eecf3" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.091673 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" podUID="22a42f16-b74f-4323-aee5-2d713c1232ea" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.141171 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" podUID="b8e9c61c-a72f-41e1-8f62-99d951ce4950" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.339992 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" podUID="a5d0bd48-d38b-44ae-b486-1ff751a0791a" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.341999 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" event={"ID":"d5d121c4-7a04-478d-b210-36b258949699","Type":"ContainerStarted","Data":"ebd3a4355acbb22517d90558fc65ef9789ad727e595bc1589436795e0e832b15"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.344518 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" event={"ID":"623d7dfe-ddc2-4557-95a3-02b8fb56ee35","Type":"ContainerStarted","Data":"1f3f61fb9d3a866615c0060acd41c61572642d89b6ac78174f531d877b9efe06"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.345081 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.347225 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.358741 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" event={"ID":"b8e9c61c-a72f-41e1-8f62-99d951ce4950","Type":"ContainerStarted","Data":"4e8129eaabedd29bb11b66c0ecec08c516a8beb13e16974a1b29898ba79127a0"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.367468 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" event={"ID":"a35dd4ba-4d05-4af0-b0b2-2285e9e35889","Type":"ContainerStarted","Data":"b59b454d8c96eb316eff16a3a03a2852e2c5a0bfbb6816f4a29dd00ccc831330"} Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.370214 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" podUID="a35dd4ba-4d05-4af0-b0b2-2285e9e35889" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.370862 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" event={"ID":"7f82c652-2e90-4fd6-bd23-381c2f529a27","Type":"ContainerStarted","Data":"44f441d2191fe78dd8f8f7f1c91713b7cbc77226d11aeddefbab3d1f690eeec4"} Dec 02 16:12:13 crc 
kubenswrapper[4933]: E1202 16:12:13.375282 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" podUID="7f82c652-2e90-4fd6-bd23-381c2f529a27" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.425543 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" event={"ID":"d7ebb5f9-1140-47dc-a2e2-1952a295d218","Type":"ContainerStarted","Data":"d48c16853e244e67342183fc36a77743a02824ff35e8babf9b0af0f36111a1a4"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.427712 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.430702 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.461324 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" event={"ID":"fba1c43c-fdfb-4ea0-939b-e38adcc79720","Type":"ContainerStarted","Data":"e27286ebcb1f9969f6a94a71134f6ed857f5b579666cddb7c1f3893121629cc8"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.463000 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.466393 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.474699 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" event={"ID":"22a42f16-b74f-4323-aee5-2d713c1232ea","Type":"ContainerStarted","Data":"62c3a74ca7272175c2b517302a88694dfe4f09dd1a81c2bff7fafcfa964ee709"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.514349 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" event={"ID":"e19f7d2e-55da-4ba5-9a68-0d49c06eecf3","Type":"ContainerStarted","Data":"0447e142b54665e475e0099c87ed78c287895f1485c9cb2e22be75e4b838d370"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.525928 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" event={"ID":"a9221a82-9f86-4d33-a63b-71bd4c532830","Type":"ContainerStarted","Data":"8a41c7a718e673c940d53ce765c177ec84521b0d04a56355e7ece19e86f82f50"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.525970 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" event={"ID":"a9221a82-9f86-4d33-a63b-71bd4c532830","Type":"ContainerStarted","Data":"3899b646ad5ff887c5f94e5725e5c888c3b45695b05f9fb863f51b8c47d484b9"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.526796 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.565475 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" event={"ID":"443e1418-ad3d-4f7a-b7b0-682beceb2977","Type":"ContainerStarted","Data":"afa75f9ff14d056f2d310e8766f324cd32abcd3468cdff662396e18fa1e92ab6"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.566533 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.568667 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.579094 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" event={"ID":"4a6c2fb7-89ad-402e-8e31-b56e50d1386c","Type":"ContainerStarted","Data":"6967ce8b1d49c8ae3edfe58253886c5cf76f157c0d6a8e2dd6406cfeff051bac"} Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.579964 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.586580 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.586586 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" podUID="94050b59-4392-4a9d-9ce8-b5e2c61e0d46" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.605798 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-vgblp" podStartSLOduration=2.997699388 podStartE2EDuration="46.605783044s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.090942814 +0000 UTC m=+1152.342169517" lastFinishedPulling="2025-12-02 16:12:12.69902646 +0000 UTC m=+1195.950253173" observedRunningTime="2025-12-02 16:12:13.603688528 +0000 UTC m=+1196.854915231" watchObservedRunningTime="2025-12-02 16:12:13.605783044 +0000 UTC m=+1196.857009737" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.629088 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" podUID="9e637867-b8e7-48b3-987d-53172ec80734" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.633010 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zhsjb" podStartSLOduration=3.855826738 podStartE2EDuration="47.632996747s" podCreationTimestamp="2025-12-02 16:11:26 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.077670881 +0000 UTC m=+1152.328897584" lastFinishedPulling="2025-12-02 16:12:12.85484088 +0000 UTC m=+1196.106067593" 
observedRunningTime="2025-12-02 16:12:13.629531705 +0000 UTC m=+1196.880758408" watchObservedRunningTime="2025-12-02 16:12:13.632996747 +0000 UTC m=+1196.884223450" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.714694 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" podStartSLOduration=39.962422451 podStartE2EDuration="46.714678777s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:12:05.956226855 +0000 UTC m=+1189.207453558" lastFinishedPulling="2025-12-02 16:12:12.708483181 +0000 UTC m=+1195.959709884" observedRunningTime="2025-12-02 16:12:13.71215304 +0000 UTC m=+1196.963379743" watchObservedRunningTime="2025-12-02 16:12:13.714678777 +0000 UTC m=+1196.965905480" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.740843 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rzj5x" podStartSLOduration=5.122746663 podStartE2EDuration="46.740810062s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.467321745 +0000 UTC m=+1152.718548438" lastFinishedPulling="2025-12-02 16:12:11.085385134 +0000 UTC m=+1194.336611837" observedRunningTime="2025-12-02 16:12:13.730761625 +0000 UTC m=+1196.981988328" watchObservedRunningTime="2025-12-02 16:12:13.740810062 +0000 UTC m=+1196.992036765" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.773256 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" podStartSLOduration=5.049749174 podStartE2EDuration="46.773230933s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.439475095 +0000 UTC m=+1152.690701798" lastFinishedPulling="2025-12-02 16:12:11.162956854 +0000 UTC m=+1194.414183557" observedRunningTime="2025-12-02 16:12:13.760897736 +0000 UTC m=+1197.012124439" watchObservedRunningTime="2025-12-02 16:12:13.773230933 +0000 UTC m=+1197.024457646" Dec 02 16:12:13 crc kubenswrapper[4933]: E1202 16:12:13.807291 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" podUID="e7437781-26c4-4a57-8afc-6bbc1ef7f7dd" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.833988 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-9kzqg" podStartSLOduration=4.398041148 podStartE2EDuration="46.833954737s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:28.711367288 +0000 UTC m=+1151.962593991" lastFinishedPulling="2025-12-02 16:12:11.147280877 +0000 UTC m=+1194.398507580" observedRunningTime="2025-12-02 16:12:13.8254101 +0000 UTC m=+1197.076636803" watchObservedRunningTime="2025-12-02 16:12:13.833954737 +0000 UTC m=+1197.085181460" Dec 02 16:12:13 crc kubenswrapper[4933]: I1202 16:12:13.870108 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ltqgf" podStartSLOduration=5.343083008 podStartE2EDuration="47.870075337s" podCreationTimestamp="2025-12-02 16:11:26 +0000 UTC" 
firstStartedPulling="2025-12-02 16:11:28.711687457 +0000 UTC m=+1151.962914170" lastFinishedPulling="2025-12-02 16:12:11.238679796 +0000 UTC m=+1194.489906499" observedRunningTime="2025-12-02 16:12:13.85554364 +0000 UTC m=+1197.106770333" watchObservedRunningTime="2025-12-02 16:12:13.870075337 +0000 UTC m=+1197.121302030" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.590063 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" event={"ID":"a5d0bd48-d38b-44ae-b486-1ff751a0791a","Type":"ContainerStarted","Data":"8c81f8869f707e9c111f07ebfa417ba6b2115533674accbecee7742bf941d682"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.590429 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" event={"ID":"a5d0bd48-d38b-44ae-b486-1ff751a0791a","Type":"ContainerStarted","Data":"8f632bcd6f99b37dd0c95840dc28509e41d27090ef78aa5225bb07ae674fa1de"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.590669 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.594499 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" event={"ID":"b8e9c61c-a72f-41e1-8f62-99d951ce4950","Type":"ContainerStarted","Data":"09c81c67d8b099c321f0ef20d27249cee8d8d136bb49b5329d228fd5311661a6"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.594685 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.598559 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" event={"ID":"e19f7d2e-55da-4ba5-9a68-0d49c06eecf3","Type":"ContainerStarted","Data":"ec924ac422c7cc67247ef5fe22cd5a9798a0a6bd3759627b0ff9ad02cb6b6283"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.598717 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.605470 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" event={"ID":"d5d121c4-7a04-478d-b210-36b258949699","Type":"ContainerStarted","Data":"2cc505930cbe7f24315d952a50ac13bf0a64f1ed4cd42f60e2799393c644e32b"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.605623 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.608537 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" event={"ID":"e7437781-26c4-4a57-8afc-6bbc1ef7f7dd","Type":"ContainerStarted","Data":"d83e990b38ac8e86fe55e2d15eb4b75929d3e13a3a7999ec37ecad0413ba04f6"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.613004 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" 
event={"ID":"94050b59-4392-4a9d-9ce8-b5e2c61e0d46","Type":"ContainerStarted","Data":"4e41e18c1dd926506449f430f958689b0ebd08f535da6102a3f41a837945d7e4"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.622686 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" podStartSLOduration=3.6746745560000003 podStartE2EDuration="47.622664024s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.935222238 +0000 UTC m=+1153.186448951" lastFinishedPulling="2025-12-02 16:12:13.883211716 +0000 UTC m=+1197.134438419" observedRunningTime="2025-12-02 16:12:14.614259421 +0000 UTC m=+1197.865486134" watchObservedRunningTime="2025-12-02 16:12:14.622664024 +0000 UTC m=+1197.873890737" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.659681 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" event={"ID":"7ab353e5-677f-499c-8909-47813767a26c","Type":"ContainerStarted","Data":"ac52b122996948ac818d381400b9fe6d4ae5e82db4208f8fe2761ee93174e498"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.660013 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.663301 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.665892 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" event={"ID":"9e637867-b8e7-48b3-987d-53172ec80734","Type":"ContainerStarted","Data":"b1ccf02f2e14836215523b66d21b7085f9bf1168ae5c2860b376d14d240a8227"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.673147 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" event={"ID":"22a42f16-b74f-4323-aee5-2d713c1232ea","Type":"ContainerStarted","Data":"31cb8cd614a706b48ba77d29d3dabbe92453b485669442ac0aa2b3c3dce9e0a0"} Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.673186 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" Dec 02 16:12:14 crc kubenswrapper[4933]: E1202 16:12:14.676117 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" podUID="a35dd4ba-4d05-4af0-b0b2-2285e9e35889" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.704902 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" podStartSLOduration=3.438046049 podStartE2EDuration="47.704802026s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.925424988 +0000 UTC m=+1153.176651701" lastFinishedPulling="2025-12-02 16:12:14.192180975 +0000 UTC m=+1197.443407678" observedRunningTime="2025-12-02 16:12:14.700203383 +0000 UTC m=+1197.951430086" 
watchObservedRunningTime="2025-12-02 16:12:14.704802026 +0000 UTC m=+1197.956028729" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.719527 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" podStartSLOduration=3.5946094779999997 podStartE2EDuration="47.719507556s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:30.065574682 +0000 UTC m=+1153.316801385" lastFinishedPulling="2025-12-02 16:12:14.19047276 +0000 UTC m=+1197.441699463" observedRunningTime="2025-12-02 16:12:14.716729813 +0000 UTC m=+1197.967956506" watchObservedRunningTime="2025-12-02 16:12:14.719507556 +0000 UTC m=+1197.970734259" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.750659 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" podStartSLOduration=4.083515799 podStartE2EDuration="47.750638734s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:30.373442782 +0000 UTC m=+1153.624669475" lastFinishedPulling="2025-12-02 16:12:14.040565697 +0000 UTC m=+1197.291792410" observedRunningTime="2025-12-02 16:12:14.74637337 +0000 UTC m=+1197.997600093" watchObservedRunningTime="2025-12-02 16:12:14.750638734 +0000 UTC m=+1198.001865437" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.817140 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" podStartSLOduration=3.714977616 podStartE2EDuration="47.81711599s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.929924967 +0000 UTC m=+1153.181151670" lastFinishedPulling="2025-12-02 16:12:14.032063331 +0000 UTC m=+1197.283290044" observedRunningTime="2025-12-02 16:12:14.793117712 +0000 UTC m=+1198.044344425" watchObservedRunningTime="2025-12-02 16:12:14.81711599 +0000 UTC m=+1198.068342693" Dec 02 16:12:14 crc kubenswrapper[4933]: I1202 16:12:14.891132 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-c2pbl" podStartSLOduration=5.120574395 podStartE2EDuration="47.891112296s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:30.401990361 +0000 UTC m=+1153.653217064" lastFinishedPulling="2025-12-02 16:12:13.172528262 +0000 UTC m=+1196.423754965" observedRunningTime="2025-12-02 16:12:14.882396035 +0000 UTC m=+1198.133622738" watchObservedRunningTime="2025-12-02 16:12:14.891112296 +0000 UTC m=+1198.142338999" Dec 02 16:12:15 crc kubenswrapper[4933]: I1202 16:12:15.698925 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" event={"ID":"94050b59-4392-4a9d-9ce8-b5e2c61e0d46","Type":"ContainerStarted","Data":"b3ae0083d50c67efb34a9634604537059e69d2eaeaa195cf65350cd9cd375026"} Dec 02 16:12:15 crc kubenswrapper[4933]: I1202 16:12:15.719868 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" podStartSLOduration=4.314142328 podStartE2EDuration="48.719847897s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:30.325721834 +0000 UTC m=+1153.576948537" lastFinishedPulling="2025-12-02 16:12:14.731427403 +0000 
UTC m=+1197.982654106" observedRunningTime="2025-12-02 16:12:15.719644742 +0000 UTC m=+1198.970871465" watchObservedRunningTime="2025-12-02 16:12:15.719847897 +0000 UTC m=+1198.971074600" Dec 02 16:12:16 crc kubenswrapper[4933]: I1202 16:12:16.709066 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" event={"ID":"9e637867-b8e7-48b3-987d-53172ec80734","Type":"ContainerStarted","Data":"866718d7f360355ed8e29cb025e97aead09b085632ddd92ba3872129a32baedc"} Dec 02 16:12:16 crc kubenswrapper[4933]: I1202 16:12:16.709506 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:12:16 crc kubenswrapper[4933]: I1202 16:12:16.711955 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" event={"ID":"e7437781-26c4-4a57-8afc-6bbc1ef7f7dd","Type":"ContainerStarted","Data":"eb533a63db242b26444e6c759d77063a53555ff49f028e4184f9a215f83a0c60"} Dec 02 16:12:16 crc kubenswrapper[4933]: I1202 16:12:16.712161 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:12:16 crc kubenswrapper[4933]: I1202 16:12:16.732345 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" podStartSLOduration=4.262504066 podStartE2EDuration="49.73232304s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.903201487 +0000 UTC m=+1153.154428190" lastFinishedPulling="2025-12-02 16:12:15.373020461 +0000 UTC m=+1198.624247164" observedRunningTime="2025-12-02 16:12:16.721858952 +0000 UTC m=+1199.973085665" watchObservedRunningTime="2025-12-02 16:12:16.73232304 +0000 UTC m=+1199.983549753" Dec 02 16:12:16 crc kubenswrapper[4933]: I1202 16:12:16.743318 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" podStartSLOduration=3.8423653719999997 podStartE2EDuration="49.743298652s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.474060014 +0000 UTC m=+1152.725286717" lastFinishedPulling="2025-12-02 16:12:15.374993294 +0000 UTC m=+1198.626219997" observedRunningTime="2025-12-02 16:12:16.737308693 +0000 UTC m=+1199.988535416" watchObservedRunningTime="2025-12-02 16:12:16.743298652 +0000 UTC m=+1199.994525355" Dec 02 16:12:17 crc kubenswrapper[4933]: I1202 16:12:17.678038 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" Dec 02 16:12:17 crc kubenswrapper[4933]: I1202 16:12:17.720295 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" event={"ID":"2e0aae7d-292b-4117-8b13-6021d1b5174a","Type":"ContainerStarted","Data":"dac02e99a5f6853c0db1972a84fd02a9eb06a9d8efbf19dc56241d2025c06ec9"} Dec 02 16:12:17 crc kubenswrapper[4933]: I1202 16:12:17.720350 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" event={"ID":"2e0aae7d-292b-4117-8b13-6021d1b5174a","Type":"ContainerStarted","Data":"8e1dde476827cac8d415240b64008c53eae3632735afed32df21c133c6c073c3"} Dec 02 16:12:17 crc 
kubenswrapper[4933]: I1202 16:12:17.747047 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" podStartSLOduration=39.757836725 podStartE2EDuration="50.747027953s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:12:05.935262307 +0000 UTC m=+1189.186489010" lastFinishedPulling="2025-12-02 16:12:16.924453535 +0000 UTC m=+1200.175680238" observedRunningTime="2025-12-02 16:12:17.740710215 +0000 UTC m=+1200.991936918" watchObservedRunningTime="2025-12-02 16:12:17.747027953 +0000 UTC m=+1200.998254656" Dec 02 16:12:18 crc kubenswrapper[4933]: I1202 16:12:18.237847 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:12:18 crc kubenswrapper[4933]: I1202 16:12:18.242070 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" Dec 02 16:12:18 crc kubenswrapper[4933]: I1202 16:12:18.730198 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:12:20 crc kubenswrapper[4933]: I1202 16:12:20.597597 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-596767c485-f2zz4" Dec 02 16:12:21 crc kubenswrapper[4933]: I1202 16:12:21.758410 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" event={"ID":"fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67","Type":"ContainerStarted","Data":"e28fc45872b800dc68f18cdd5b9fba134c9e3a5ccdf838f0c6a4875110deb33a"} Dec 02 16:12:21 crc kubenswrapper[4933]: I1202 16:12:21.779892 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f8sqn" podStartSLOduration=3.828186825 podStartE2EDuration="54.77986911s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:30.356400629 +0000 UTC m=+1153.607627332" lastFinishedPulling="2025-12-02 16:12:21.308082894 +0000 UTC m=+1204.559309617" observedRunningTime="2025-12-02 16:12:21.772158435 +0000 UTC m=+1205.023385128" watchObservedRunningTime="2025-12-02 16:12:21.77986911 +0000 UTC m=+1205.031095823" Dec 02 16:12:23 crc kubenswrapper[4933]: I1202 16:12:23.579649 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ggssz" Dec 02 16:12:24 crc kubenswrapper[4933]: I1202 16:12:24.233378 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8" Dec 02 16:12:24 crc kubenswrapper[4933]: I1202 16:12:24.782269 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" event={"ID":"7f82c652-2e90-4fd6-bd23-381c2f529a27","Type":"ContainerStarted","Data":"1cda7472271b8ad94babe8acd60f8d9d8b5a7082ad6eb8d6439242fbfc6e4555"} Dec 02 16:12:24 crc kubenswrapper[4933]: I1202 16:12:24.782817 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:12:24 crc kubenswrapper[4933]: I1202 16:12:24.804389 
4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" podStartSLOduration=3.679059413 podStartE2EDuration="57.804363375s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:30.375763464 +0000 UTC m=+1153.626990167" lastFinishedPulling="2025-12-02 16:12:24.501067436 +0000 UTC m=+1207.752294129" observedRunningTime="2025-12-02 16:12:24.796128236 +0000 UTC m=+1208.047354949" watchObservedRunningTime="2025-12-02 16:12:24.804363375 +0000 UTC m=+1208.055590088" Dec 02 16:12:27 crc kubenswrapper[4933]: I1202 16:12:27.680755 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-t7rsk" Dec 02 16:12:27 crc kubenswrapper[4933]: I1202 16:12:27.884601 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-rmd89" Dec 02 16:12:27 crc kubenswrapper[4933]: I1202 16:12:27.982392 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7zb88" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.068601 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n59qc" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.410656 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mgdvd" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.676289 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dbnk8" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.740424 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-65697495f7-vqpwl" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.761903 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qgclr" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.814624 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" event={"ID":"9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa","Type":"ContainerStarted","Data":"373d715a6881938461b2194a1ca65ef83f87ed0c0331e3ee7b18a12d2e3a8e75"} Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.814968 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:12:28 crc kubenswrapper[4933]: I1202 16:12:28.838437 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" podStartSLOduration=3.63267394 podStartE2EDuration="1m1.838419215s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.446238145 +0000 UTC m=+1152.697464848" lastFinishedPulling="2025-12-02 16:12:27.65198342 +0000 UTC m=+1210.903210123" observedRunningTime="2025-12-02 16:12:28.834506361 +0000 UTC m=+1212.085733074" watchObservedRunningTime="2025-12-02 16:12:28.838419215 +0000 UTC m=+1212.089645918" Dec 02 
16:12:30 crc kubenswrapper[4933]: I1202 16:12:30.835028 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" event={"ID":"a35dd4ba-4d05-4af0-b0b2-2285e9e35889","Type":"ContainerStarted","Data":"23e032b0a11e99eec555eba4842bda579ee55c9fca6f95514a0aabfcc9764ce1"} Dec 02 16:12:30 crc kubenswrapper[4933]: I1202 16:12:30.835562 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:12:30 crc kubenswrapper[4933]: I1202 16:12:30.857886 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" podStartSLOduration=3.150503578 podStartE2EDuration="1m3.857869164s" podCreationTimestamp="2025-12-02 16:11:27 +0000 UTC" firstStartedPulling="2025-12-02 16:11:29.893427087 +0000 UTC m=+1153.144653790" lastFinishedPulling="2025-12-02 16:12:30.600792653 +0000 UTC m=+1213.852019376" observedRunningTime="2025-12-02 16:12:30.851337161 +0000 UTC m=+1214.102563884" watchObservedRunningTime="2025-12-02 16:12:30.857869164 +0000 UTC m=+1214.109095877" Dec 02 16:12:37 crc kubenswrapper[4933]: I1202 16:12:37.868098 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-mslvn" Dec 02 16:12:38 crc kubenswrapper[4933]: I1202 16:12:38.109261 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-87566" Dec 02 16:12:38 crc kubenswrapper[4933]: I1202 16:12:38.762579 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-w2ddp" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.013574 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tdkq4"] Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.019010 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.024108 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.024325 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6zl82" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.024388 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.024347 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.030134 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tdkq4"] Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.124956 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-64h58"] Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.126741 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.129346 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.146787 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-64h58"] Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.179998 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbgn\" (UniqueName: \"kubernetes.io/projected/0e0da7e0-c02a-49ff-8814-fafd537983b1-kube-api-access-bnbgn\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.180055 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75bfg\" (UniqueName: \"kubernetes.io/projected/1fa23eaa-697d-49d3-b507-51018b313a75-kube-api-access-75bfg\") pod \"dnsmasq-dns-675f4bcbfc-tdkq4\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.180129 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa23eaa-697d-49d3-b507-51018b313a75-config\") pod \"dnsmasq-dns-675f4bcbfc-tdkq4\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.180157 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-config\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.180186 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.281613 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa23eaa-697d-49d3-b507-51018b313a75-config\") pod \"dnsmasq-dns-675f4bcbfc-tdkq4\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.281679 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-config\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.281725 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc 
kubenswrapper[4933]: I1202 16:12:53.281815 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbgn\" (UniqueName: \"kubernetes.io/projected/0e0da7e0-c02a-49ff-8814-fafd537983b1-kube-api-access-bnbgn\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.281870 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75bfg\" (UniqueName: \"kubernetes.io/projected/1fa23eaa-697d-49d3-b507-51018b313a75-kube-api-access-75bfg\") pod \"dnsmasq-dns-675f4bcbfc-tdkq4\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.282719 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa23eaa-697d-49d3-b507-51018b313a75-config\") pod \"dnsmasq-dns-675f4bcbfc-tdkq4\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.282714 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-config\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.282800 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.303373 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbgn\" (UniqueName: \"kubernetes.io/projected/0e0da7e0-c02a-49ff-8814-fafd537983b1-kube-api-access-bnbgn\") pod \"dnsmasq-dns-78dd6ddcc-64h58\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.304260 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75bfg\" (UniqueName: \"kubernetes.io/projected/1fa23eaa-697d-49d3-b507-51018b313a75-kube-api-access-75bfg\") pod \"dnsmasq-dns-675f4bcbfc-tdkq4\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.339532 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.448306 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.816575 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tdkq4"] Dec 02 16:12:53 crc kubenswrapper[4933]: I1202 16:12:53.959082 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-64h58"] Dec 02 16:12:53 crc kubenswrapper[4933]: W1202 16:12:53.962665 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e0da7e0_c02a_49ff_8814_fafd537983b1.slice/crio-f82914c2d478c5f74b5eb0fbfaae6fc889a0c43d617075be3f91aeae5a4d0ee8 WatchSource:0}: Error finding container f82914c2d478c5f74b5eb0fbfaae6fc889a0c43d617075be3f91aeae5a4d0ee8: Status 404 returned error can't find the container with id f82914c2d478c5f74b5eb0fbfaae6fc889a0c43d617075be3f91aeae5a4d0ee8 Dec 02 16:12:54 crc kubenswrapper[4933]: I1202 16:12:54.069031 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" event={"ID":"0e0da7e0-c02a-49ff-8814-fafd537983b1","Type":"ContainerStarted","Data":"f82914c2d478c5f74b5eb0fbfaae6fc889a0c43d617075be3f91aeae5a4d0ee8"} Dec 02 16:12:54 crc kubenswrapper[4933]: I1202 16:12:54.069994 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" event={"ID":"1fa23eaa-697d-49d3-b507-51018b313a75","Type":"ContainerStarted","Data":"f4a88eba097198584ecc54c48e1273105d3ae5dc453de8f9afec83b3b0b1da80"} Dec 02 16:12:55 crc kubenswrapper[4933]: I1202 16:12:55.980371 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tdkq4"] Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.015855 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4tp9z"] Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.018956 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.028177 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4tp9z"] Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.044115 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.044435 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-config\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.044494 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rh2v\" (UniqueName: \"kubernetes.io/projected/e246afd2-cb0b-49f6-8144-6d4acba08825-kube-api-access-2rh2v\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.145215 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-config\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.145576 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rh2v\" (UniqueName: \"kubernetes.io/projected/e246afd2-cb0b-49f6-8144-6d4acba08825-kube-api-access-2rh2v\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.145768 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.146790 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.147400 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-config\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.166177 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rh2v\" (UniqueName: 
\"kubernetes.io/projected/e246afd2-cb0b-49f6-8144-6d4acba08825-kube-api-access-2rh2v\") pod \"dnsmasq-dns-666b6646f7-4tp9z\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") " pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.287944 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-64h58"] Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.331789 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxkr8"] Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.333338 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.339056 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxkr8"] Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.353356 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l76z\" (UniqueName: \"kubernetes.io/projected/1042bb7b-3049-4492-a177-d1eb37d550a7-kube-api-access-4l76z\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.353523 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.353578 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-config\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.356287 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.455326 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-config\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.455388 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l76z\" (UniqueName: \"kubernetes.io/projected/1042bb7b-3049-4492-a177-d1eb37d550a7-kube-api-access-4l76z\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.455495 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.456234 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.456275 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-config\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.479097 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l76z\" (UniqueName: \"kubernetes.io/projected/1042bb7b-3049-4492-a177-d1eb37d550a7-kube-api-access-4l76z\") pod \"dnsmasq-dns-57d769cc4f-rxkr8\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.663251 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:12:56 crc kubenswrapper[4933]: I1202 16:12:56.901995 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4tp9z"] Dec 02 16:12:56 crc kubenswrapper[4933]: W1202 16:12:56.912804 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode246afd2_cb0b_49f6_8144_6d4acba08825.slice/crio-fe055abd92fee46f5f33ddaff98e45c1b4ef1276b7806f8cb51acf804d976fba WatchSource:0}: Error finding container fe055abd92fee46f5f33ddaff98e45c1b4ef1276b7806f8cb51acf804d976fba: Status 404 returned error can't find the container with id fe055abd92fee46f5f33ddaff98e45c1b4ef1276b7806f8cb51acf804d976fba Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.134851 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" event={"ID":"e246afd2-cb0b-49f6-8144-6d4acba08825","Type":"ContainerStarted","Data":"fe055abd92fee46f5f33ddaff98e45c1b4ef1276b7806f8cb51acf804d976fba"} Dec 02 16:12:57 crc kubenswrapper[4933]: W1202 16:12:57.158045 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1042bb7b_3049_4492_a177_d1eb37d550a7.slice/crio-83b17980631c356d2532612682c8a7947a32d7d962189ece71586c79edb1efcb WatchSource:0}: Error finding container 83b17980631c356d2532612682c8a7947a32d7d962189ece71586c79edb1efcb: Status 404 returned error can't find the container with id 83b17980631c356d2532612682c8a7947a32d7d962189ece71586c79edb1efcb Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.163450 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxkr8"] Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.176219 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.178688 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.182688 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.182765 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.182785 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.182929 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.183502 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-58f87" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.183643 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.183680 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.188890 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.375740 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.375798 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.375859 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.375910 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-config-data\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.375957 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.376011 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5df97ea3-3281-4995-bf14-cb09bf09f39d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.376037 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.376090 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.376123 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7ctd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-kube-api-access-b7ctd\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.376152 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.376277 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5df97ea3-3281-4995-bf14-cb09bf09f39d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.461680 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.469300 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.469425 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.471671 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.471896 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.471926 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.473303 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.473420 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.473308 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.473607 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d9fbb" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478034 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478084 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-config-data\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478130 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478162 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5df97ea3-3281-4995-bf14-cb09bf09f39d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478189 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478225 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478248 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b7ctd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-kube-api-access-b7ctd\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478264 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478291 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5df97ea3-3281-4995-bf14-cb09bf09f39d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478343 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.478370 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.479195 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.479420 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.480003 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.480478 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.480717 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.482074 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.502208 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5df97ea3-3281-4995-bf14-cb09bf09f39d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.502253 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.502568 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5df97ea3-3281-4995-bf14-cb09bf09f39d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.502898 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.506025 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7ctd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-kube-api-access-b7ctd\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.510796 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.531505 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579375 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579442 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579486 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579530 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579562 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579615 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579683 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897gp\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-kube-api-access-897gp\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579761 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579781 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579802 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.579949 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.681753 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682219 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682257 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682289 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682338 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682366 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682403 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: 
I1202 16:12:57.682488 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682600 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682662 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682697 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897gp\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-kube-api-access-897gp\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.682740 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.683741 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.684290 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.684529 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.684838 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.685710 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.703431 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.704200 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.704742 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.705562 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.726682 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897gp\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-kube-api-access-897gp\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.737405 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:57 crc kubenswrapper[4933]: I1202 16:12:57.872259 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.161084 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" event={"ID":"1042bb7b-3049-4492-a177-d1eb37d550a7","Type":"ContainerStarted","Data":"83b17980631c356d2532612682c8a7947a32d7d962189ece71586c79edb1efcb"} Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.932079 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.934968 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.939546 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.940147 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tpdkl"
Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.940165 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.940262 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.948305 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 16:12:58 crc kubenswrapper[4933]: I1202 16:12:58.949400 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.012438 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.012490 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd426cb0-522e-4c62-874e-a85119e82490-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.012531 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd426cb0-522e-4c62-874e-a85119e82490-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.012554 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.012655 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.013085 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rw6v\" (UniqueName: \"kubernetes.io/projected/cd426cb0-522e-4c62-874e-a85119e82490-kube-api-access-6rw6v\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0"
Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.013156 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.013264 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426cb0-522e-4c62-874e-a85119e82490-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.114514 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426cb0-522e-4c62-874e-a85119e82490-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.114879 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.114901 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd426cb0-522e-4c62-874e-a85119e82490-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.114952 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd426cb0-522e-4c62-874e-a85119e82490-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.114986 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.115010 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.115147 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rw6v\" (UniqueName: \"kubernetes.io/projected/cd426cb0-522e-4c62-874e-a85119e82490-kube-api-access-6rw6v\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.115181 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.115397 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.135295 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd426cb0-522e-4c62-874e-a85119e82490-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.136732 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd426cb0-522e-4c62-874e-a85119e82490-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.137080 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.137270 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.137964 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd426cb0-522e-4c62-874e-a85119e82490-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.149871 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.155618 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rw6v\" (UniqueName: \"kubernetes.io/projected/cd426cb0-522e-4c62-874e-a85119e82490-kube-api-access-6rw6v\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.162024 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426cb0-522e-4c62-874e-a85119e82490-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd426cb0-522e-4c62-874e-a85119e82490\") " pod="openstack/openstack-galera-0" Dec 02 16:12:59 crc kubenswrapper[4933]: I1202 16:12:59.261947 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.317944 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.321015 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.324485 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.324487 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-t29ph"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.325816 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.325984 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.331334 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.456751 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.458321 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.463300 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466446 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/553a306b-ceeb-41b2-8b2c-c32dcd70639e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466527 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466621 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8qw\" (UniqueName: \"kubernetes.io/projected/553a306b-ceeb-41b2-8b2c-c32dcd70639e-kube-api-access-fh8qw\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466649 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466676 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName:
\"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466723 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466749 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/553a306b-ceeb-41b2-8b2c-c32dcd70639e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.466800 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553a306b-ceeb-41b2-8b2c-c32dcd70639e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.467854 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2gp4f" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.472902 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.488110 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.570764 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/553a306b-ceeb-41b2-8b2c-c32dcd70639e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.570841 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.570917 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8qw\" (UniqueName: \"kubernetes.io/projected/553a306b-ceeb-41b2-8b2c-c32dcd70639e-kube-api-access-fh8qw\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.570973 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a2d025-2245-46a6-82d1-228e920490a3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.570991 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571011 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571029 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32a2d025-2245-46a6-82d1-228e920490a3-kolla-config\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571062 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571082 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/553a306b-ceeb-41b2-8b2c-c32dcd70639e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571121 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553a306b-ceeb-41b2-8b2c-c32dcd70639e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571152 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnfj\" (UniqueName: \"kubernetes.io/projected/32a2d025-2245-46a6-82d1-228e920490a3-kube-api-access-qjnfj\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571182 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a2d025-2245-46a6-82d1-228e920490a3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571226 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32a2d025-2245-46a6-82d1-228e920490a3-config-data\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571278 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.571595 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/553a306b-ceeb-41b2-8b2c-c32dcd70639e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.572199 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.572236 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.572314 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/553a306b-ceeb-41b2-8b2c-c32dcd70639e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.576741 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/553a306b-ceeb-41b2-8b2c-c32dcd70639e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.578132 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553a306b-ceeb-41b2-8b2c-c32dcd70639e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.589741 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8qw\" (UniqueName: \"kubernetes.io/projected/553a306b-ceeb-41b2-8b2c-c32dcd70639e-kube-api-access-fh8qw\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.605509 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"553a306b-ceeb-41b2-8b2c-c32dcd70639e\") " pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.672588 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a2d025-2245-46a6-82d1-228e920490a3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " 
pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.672633 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32a2d025-2245-46a6-82d1-228e920490a3-kolla-config\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.672712 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnfj\" (UniqueName: \"kubernetes.io/projected/32a2d025-2245-46a6-82d1-228e920490a3-kube-api-access-qjnfj\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.672766 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a2d025-2245-46a6-82d1-228e920490a3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.672797 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32a2d025-2245-46a6-82d1-228e920490a3-config-data\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.673538 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32a2d025-2245-46a6-82d1-228e920490a3-config-data\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.674033 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32a2d025-2245-46a6-82d1-228e920490a3-kolla-config\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.675239 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.683887 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a2d025-2245-46a6-82d1-228e920490a3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.683943 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a2d025-2245-46a6-82d1-228e920490a3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.699813 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnfj\" (UniqueName: \"kubernetes.io/projected/32a2d025-2245-46a6-82d1-228e920490a3-kube-api-access-qjnfj\") pod \"memcached-0\" (UID: \"32a2d025-2245-46a6-82d1-228e920490a3\") " pod="openstack/memcached-0" Dec 02 16:13:00 crc kubenswrapper[4933]: I1202 16:13:00.776345 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Dec 02 16:13:02 crc kubenswrapper[4933]: I1202 16:13:02.687587 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 16:13:02 crc kubenswrapper[4933]: I1202 16:13:02.690020 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 16:13:02 crc kubenswrapper[4933]: I1202 16:13:02.696231 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n5nwj"
Dec 02 16:13:02 crc kubenswrapper[4933]: I1202 16:13:02.713965 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 16:13:02 crc kubenswrapper[4933]: I1202 16:13:02.825798 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js6p2\" (UniqueName: \"kubernetes.io/projected/09f9fb84-64a0-46a8-8551-4fcbf0e46050-kube-api-access-js6p2\") pod \"kube-state-metrics-0\" (UID: \"09f9fb84-64a0-46a8-8551-4fcbf0e46050\") " pod="openstack/kube-state-metrics-0"
Dec 02 16:13:02 crc kubenswrapper[4933]: I1202 16:13:02.931246 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js6p2\" (UniqueName: \"kubernetes.io/projected/09f9fb84-64a0-46a8-8551-4fcbf0e46050-kube-api-access-js6p2\") pod \"kube-state-metrics-0\" (UID: \"09f9fb84-64a0-46a8-8551-4fcbf0e46050\") " pod="openstack/kube-state-metrics-0"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.003235 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js6p2\" (UniqueName: \"kubernetes.io/projected/09f9fb84-64a0-46a8-8551-4fcbf0e46050-kube-api-access-js6p2\") pod \"kube-state-metrics-0\" (UID: \"09f9fb84-64a0-46a8-8551-4fcbf0e46050\") " pod="openstack/kube-state-metrics-0"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.043272 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.462246 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"]
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.463812 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"
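kube-state-metrics-0 mounts a single volume, kube-api-access-js6p2: the projected service-account volume that the ServiceAccount admission plugin injects into pods automatically. Its contents are not printed in the log; as a sketch, it projects roughly the following three sources, where the source list and the 3607-second token expiry are upstream Kubernetes defaults assumed here rather than read from this cluster:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // kubeAPIAccessVolume sketches the admission-injected kube-api-access-*
    // projected volume: a bound service-account token, the cluster CA bundle,
    // and the pod's namespace via the downward API.
    func kubeAPIAccessVolume(name string) corev1.Volume {
        expiry := int64(3607) // upstream default, assumed
        return corev1.Volume{
            Name: name, // e.g. "kube-api-access-js6p2"
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path:              "token",
                            ExpirationSeconds: &expiry,
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{
                                Path:     "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
                            }},
                        }},
                    },
                },
            },
        }
    }

    func main() {
        fmt.Printf("%+v\n", kubeAPIAccessVolume("kube-api-access-js6p2"))
    }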
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.466979 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.467129 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-6bkjp"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.485202 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"]
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.671586 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d691d907-a29d-40ad-ad96-009e8a7d56e8-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.671692 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khh66\" (UniqueName: \"kubernetes.io/projected/d691d907-a29d-40ad-ad96-009e8a7d56e8-kube-api-access-khh66\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.773573 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d691d907-a29d-40ad-ad96-009e8a7d56e8-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.773651 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khh66\" (UniqueName: \"kubernetes.io/projected/d691d907-a29d-40ad-ad96-009e8a7d56e8-kube-api-access-khh66\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"
Dec 02 16:13:03 crc kubenswrapper[4933]: E1202 16:13:03.774102 4933 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found
Dec 02 16:13:03 crc kubenswrapper[4933]: E1202 16:13:03.774145 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d691d907-a29d-40ad-ad96-009e8a7d56e8-serving-cert podName:d691d907-a29d-40ad-ad96-009e8a7d56e8 nodeName:}" failed. No retries permitted until 2025-12-02 16:13:04.274129674 +0000 UTC m=+1247.525356377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d691d907-a29d-40ad-ad96-009e8a7d56e8-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-zdcrf" (UID: "d691d907-a29d-40ad-ad96-009e8a7d56e8") : secret "observability-ui-dashboards" not found
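The two E1202 entries are the expected failure mode for a pod that references a Secret which does not exist yet: secret.go reports the failed lookup, and nestedpendingoperations.go parks the failed MountVolume.SetUp on an exponential backoff, 500ms here and doubling on each subsequent failure, capped in the kubelet sources at roughly two minutes. The retry visible at 16:13:04.296 below succeeds at 16:13:04.302, once the secret exists. A rough reconstruction of the backoff schedule, with the exact cap an assumption:

    package main

    import (
        "fmt"
        "time"
    )

    // durationBeforeRetry mirrors the doubling backoff implied by the log line
    // above ("durationBeforeRetry 500ms"): 500ms after the first failure,
    // doubled per failure, capped (the cap value is an assumption).
    func durationBeforeRetry(failures int) time.Duration {
        const (
            initial  = 500 * time.Millisecond
            maxDelay = 2*time.Minute + 2*time.Second
        )
        d := initial
        for i := 1; i < failures; i++ {
            d *= 2
            if d >= maxDelay {
                return maxDelay
            }
        }
        return d
    }

    func main() {
        for n := 1; n <= 9; n++ {
            fmt.Printf("failure %d -> durationBeforeRetry %v\n", n, durationBeforeRetry(n))
        }
    }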
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.810661 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khh66\" (UniqueName: \"kubernetes.io/projected/d691d907-a29d-40ad-ad96-009e8a7d56e8-kube-api-access-khh66\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.838370 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75644579b7-4b5d6"]
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.840174 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.860208 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75644579b7-4b5d6"]
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.984684 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae8bf53-f595-41aa-a854-2854ec134942-console-oauth-config\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.985039 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-service-ca\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.985070 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxxqp\" (UniqueName: \"kubernetes.io/projected/8ae8bf53-f595-41aa-a854-2854ec134942-kube-api-access-zxxqp\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.985108 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-console-config\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.985161 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-oauth-serving-cert\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.985179 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/8ae8bf53-f595-41aa-a854-2854ec134942-console-serving-cert\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:03 crc kubenswrapper[4933]: I1202 16:13:03.985200 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-trusted-ca-bundle\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.037362 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.040996 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.045079 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.047009 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.048030 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fbngp" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.050540 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.054112 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.062729 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.066999 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.090746 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-console-config\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.090832 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-oauth-serving-cert\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.090855 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae8bf53-f595-41aa-a854-2854ec134942-console-serving-cert\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.090876 4933 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-trusted-ca-bundle\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.090972 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae8bf53-f595-41aa-a854-2854ec134942-console-oauth-config\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.091015 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-service-ca\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.091038 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxxqp\" (UniqueName: \"kubernetes.io/projected/8ae8bf53-f595-41aa-a854-2854ec134942-kube-api-access-zxxqp\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.092626 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-console-config\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.093456 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-oauth-serving-cert\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.094102 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-trusted-ca-bundle\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.094710 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae8bf53-f595-41aa-a854-2854ec134942-service-ca\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.101417 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae8bf53-f595-41aa-a854-2854ec134942-console-serving-cert\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.110710 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxxqp\" (UniqueName: 
\"kubernetes.io/projected/8ae8bf53-f595-41aa-a854-2854ec134942-kube-api-access-zxxqp\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.112067 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae8bf53-f595-41aa-a854-2854ec134942-console-oauth-config\") pod \"console-75644579b7-4b5d6\" (UID: \"8ae8bf53-f595-41aa-a854-2854ec134942\") " pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.166610 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75644579b7-4b5d6" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.192909 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4mn\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-kube-api-access-6t4mn\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193064 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193117 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193138 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193278 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193317 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193349 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.193391 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/803ee8c3-0a9d-46c8-8069-1ebffde1429c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296065 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/803ee8c3-0a9d-46c8-8069-1ebffde1429c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296180 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4mn\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-kube-api-access-6t4mn\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296238 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296271 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296289 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296345 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d691d907-a29d-40ad-ad96-009e8a7d56e8-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296369 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296387 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.296423 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.298031 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.299633 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/803ee8c3-0a9d-46c8-8069-1ebffde1429c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.300479 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.302596 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d691d907-a29d-40ad-ad96-009e8a7d56e8-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zdcrf\" (UID: \"d691d907-a29d-40ad-ad96-009e8a7d56e8\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.311399 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.312427 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.319217 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.322987 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.337756 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4mn\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-kube-api-access-6t4mn\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.361063 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.399763 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 16:13:04 crc kubenswrapper[4933]: I1202 16:13:04.403490 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf" Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.923098 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-g8qb5"] Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.932487 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.944236 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ccmnp" Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.944443 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.949300 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.957899 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8qb5"] Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.981551 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2z7kb"] Dec 02 16:13:05 crc kubenswrapper[4933]: I1202 16:13:05.984248 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.029962 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2z7kb"] Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067180 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a35aeec-161d-4ef9-a42b-7967c06c7249-ovn-controller-tls-certs\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067265 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a35aeec-161d-4ef9-a42b-7967c06c7249-combined-ca-bundle\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067341 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-run\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067409 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a35aeec-161d-4ef9-a42b-7967c06c7249-scripts\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067441 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-log-ovn\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067500 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfz54\" (UniqueName: \"kubernetes.io/projected/8a35aeec-161d-4ef9-a42b-7967c06c7249-kube-api-access-bfz54\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.067519 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-run-ovn\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.155524 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.168988 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-run\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169048 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpl4k\" (UniqueName: \"kubernetes.io/projected/34ddcddf-7552-4555-8013-3dc06fc2549a-kube-api-access-vpl4k\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169105 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a35aeec-161d-4ef9-a42b-7967c06c7249-scripts\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169127 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-log\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169157 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-log-ovn\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169220 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfz54\" (UniqueName: \"kubernetes.io/projected/8a35aeec-161d-4ef9-a42b-7967c06c7249-kube-api-access-bfz54\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169239 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-lib\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-run-ovn\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169278 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ddcddf-7552-4555-8013-3dc06fc2549a-scripts\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169310 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-run\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169352 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" 
(UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-etc-ovs\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169384 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a35aeec-161d-4ef9-a42b-7967c06c7249-ovn-controller-tls-certs\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.169432 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a35aeec-161d-4ef9-a42b-7967c06c7249-combined-ca-bundle\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.170054 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-run-ovn\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.171285 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-log-ovn\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.172389 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a35aeec-161d-4ef9-a42b-7967c06c7249-var-run\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.181359 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a35aeec-161d-4ef9-a42b-7967c06c7249-scripts\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.198598 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a35aeec-161d-4ef9-a42b-7967c06c7249-combined-ca-bundle\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.199207 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a35aeec-161d-4ef9-a42b-7967c06c7249-ovn-controller-tls-certs\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.206587 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfz54\" (UniqueName: \"kubernetes.io/projected/8a35aeec-161d-4ef9-a42b-7967c06c7249-kube-api-access-bfz54\") pod \"ovn-controller-g8qb5\" (UID: \"8a35aeec-161d-4ef9-a42b-7967c06c7249\") " pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc 
kubenswrapper[4933]: I1202 16:13:06.271171 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-lib\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271223 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ddcddf-7552-4555-8013-3dc06fc2549a-scripts\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271248 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-run\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271278 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-etc-ovs\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271357 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpl4k\" (UniqueName: \"kubernetes.io/projected/34ddcddf-7552-4555-8013-3dc06fc2549a-kube-api-access-vpl4k\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271392 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-log\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271757 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-log\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.271987 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-lib\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.272970 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-etc-ovs\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.273040 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ddcddf-7552-4555-8013-3dc06fc2549a-var-run\") pod \"ovn-controller-ovs-2z7kb\" 
(UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.273771 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ddcddf-7552-4555-8013-3dc06fc2549a-scripts\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.281960 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.290925 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpl4k\" (UniqueName: \"kubernetes.io/projected/34ddcddf-7552-4555-8013-3dc06fc2549a-kube-api-access-vpl4k\") pod \"ovn-controller-ovs-2z7kb\" (UID: \"34ddcddf-7552-4555-8013-3dc06fc2549a\") " pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.329214 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.764442 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.767868 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.773100 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.773425 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.773582 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.773730 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.773971 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9sh4p" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.779189 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.884547 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.884781 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/662ba342-72a0-430b-b46d-d6f0f0eafd2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.884849 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.884884 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.884916 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.886718 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglg4\" (UniqueName: \"kubernetes.io/projected/662ba342-72a0-430b-b46d-d6f0f0eafd2b-kube-api-access-wglg4\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.886812 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/662ba342-72a0-430b-b46d-d6f0f0eafd2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.887082 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/662ba342-72a0-430b-b46d-d6f0f0eafd2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.990974 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/662ba342-72a0-430b-b46d-d6f0f0eafd2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.991062 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.991101 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.991245 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 
16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.991452 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wglg4\" (UniqueName: \"kubernetes.io/projected/662ba342-72a0-430b-b46d-d6f0f0eafd2b-kube-api-access-wglg4\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.991521 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/662ba342-72a0-430b-b46d-d6f0f0eafd2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.991807 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/662ba342-72a0-430b-b46d-d6f0f0eafd2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.992198 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.992300 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.993365 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/662ba342-72a0-430b-b46d-d6f0f0eafd2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.994255 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/662ba342-72a0-430b-b46d-d6f0f0eafd2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.996047 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/662ba342-72a0-430b-b46d-d6f0f0eafd2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.996647 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:06 crc kubenswrapper[4933]: I1202 16:13:06.996975 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:07 crc kubenswrapper[4933]: I1202 16:13:07.001833 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/662ba342-72a0-430b-b46d-d6f0f0eafd2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:07 crc kubenswrapper[4933]: I1202 16:13:07.044739 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wglg4\" (UniqueName: \"kubernetes.io/projected/662ba342-72a0-430b-b46d-d6f0f0eafd2b-kube-api-access-wglg4\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:07 crc kubenswrapper[4933]: I1202 16:13:07.061287 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"662ba342-72a0-430b-b46d-d6f0f0eafd2b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:07 crc kubenswrapper[4933]: I1202 16:13:07.114742 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.757902 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.759978 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.762495 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2kct2" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.763569 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.763579 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.764365 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.766729 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895423 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895479 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895558 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77pm\" (UniqueName: \"kubernetes.io/projected/d5b04559-f2ad-49dc-a280-626d6de841de-kube-api-access-q77pm\") pod 
\"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895624 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b04559-f2ad-49dc-a280-626d6de841de-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895646 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895683 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5b04559-f2ad-49dc-a280-626d6de841de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895788 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.895874 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5b04559-f2ad-49dc-a280-626d6de841de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997503 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997591 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5b04559-f2ad-49dc-a280-626d6de841de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997614 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997630 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997679 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q77pm\" (UniqueName: \"kubernetes.io/projected/d5b04559-f2ad-49dc-a280-626d6de841de-kube-api-access-q77pm\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997728 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997748 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b04559-f2ad-49dc-a280-626d6de841de-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.997777 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5b04559-f2ad-49dc-a280-626d6de841de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.998050 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.998349 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5b04559-f2ad-49dc-a280-626d6de841de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.998989 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5b04559-f2ad-49dc-a280-626d6de841de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:09 crc kubenswrapper[4933]: I1202 16:13:09.999579 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b04559-f2ad-49dc-a280-626d6de841de-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:10 crc kubenswrapper[4933]: I1202 16:13:10.004325 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:10 crc kubenswrapper[4933]: I1202 16:13:10.005626 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " 
pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:10 crc kubenswrapper[4933]: I1202 16:13:10.006781 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b04559-f2ad-49dc-a280-626d6de841de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:10 crc kubenswrapper[4933]: I1202 16:13:10.016865 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77pm\" (UniqueName: \"kubernetes.io/projected/d5b04559-f2ad-49dc-a280-626d6de841de-kube-api-access-q77pm\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:10 crc kubenswrapper[4933]: I1202 16:13:10.025563 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5b04559-f2ad-49dc-a280-626d6de841de\") " pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:10 crc kubenswrapper[4933]: I1202 16:13:10.100068 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:12 crc kubenswrapper[4933]: E1202 16:13:12.003005 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 16:13:12 crc kubenswrapper[4933]: E1202 16:13:12.003775 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75bfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-tdkq4_openstack(1fa23eaa-697d-49d3-b507-51018b313a75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:13:12 crc kubenswrapper[4933]: E1202 16:13:12.005301 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" podUID="1fa23eaa-697d-49d3-b507-51018b313a75" Dec 02 16:13:12 crc kubenswrapper[4933]: E1202 16:13:12.107328 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 16:13:12 crc kubenswrapper[4933]: E1202 16:13:12.107803 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnbgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-64h58_openstack(0e0da7e0-c02a-49ff-8814-fafd537983b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:13:12 crc kubenswrapper[4933]: E1202 16:13:12.109245 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" podUID="0e0da7e0-c02a-49ff-8814-fafd537983b1" Dec 02 16:13:12 crc kubenswrapper[4933]: I1202 16:13:12.310392 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3cdda86-3d0d-486d-ae36-ab6792bff2ab","Type":"ContainerStarted","Data":"725c79b533ea015717173c67be921234aebffade0a88dab09a5710a1934b5dd4"} Dec 02 16:13:12 crc kubenswrapper[4933]: I1202 16:13:12.651208 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 16:13:12 crc kubenswrapper[4933]: I1202 16:13:12.663598 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:13:12 crc kubenswrapper[4933]: I1202 16:13:12.971098 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 16:13:12 crc kubenswrapper[4933]: W1202 16:13:12.976715 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd426cb0_522e_4c62_874e_a85119e82490.slice/crio-084390e48a42252a555b707e36b0ca7f50b3272b42c9270240aa90ec8ba4e8b3 WatchSource:0}: Error finding container 084390e48a42252a555b707e36b0ca7f50b3272b42c9270240aa90ec8ba4e8b3: Status 404 returned error can't find the container with id 
084390e48a42252a555b707e36b0ca7f50b3272b42c9270240aa90ec8ba4e8b3 Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.069101 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.076340 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.167568 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75bfg\" (UniqueName: \"kubernetes.io/projected/1fa23eaa-697d-49d3-b507-51018b313a75-kube-api-access-75bfg\") pod \"1fa23eaa-697d-49d3-b507-51018b313a75\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.168053 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-config\") pod \"0e0da7e0-c02a-49ff-8814-fafd537983b1\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.168202 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnbgn\" (UniqueName: \"kubernetes.io/projected/0e0da7e0-c02a-49ff-8814-fafd537983b1-kube-api-access-bnbgn\") pod \"0e0da7e0-c02a-49ff-8814-fafd537983b1\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.168251 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa23eaa-697d-49d3-b507-51018b313a75-config\") pod \"1fa23eaa-697d-49d3-b507-51018b313a75\" (UID: \"1fa23eaa-697d-49d3-b507-51018b313a75\") " Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.168295 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-dns-svc\") pod \"0e0da7e0-c02a-49ff-8814-fafd537983b1\" (UID: \"0e0da7e0-c02a-49ff-8814-fafd537983b1\") " Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.169467 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e0da7e0-c02a-49ff-8814-fafd537983b1" (UID: "0e0da7e0-c02a-49ff-8814-fafd537983b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.171132 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa23eaa-697d-49d3-b507-51018b313a75-config" (OuterVolumeSpecName: "config") pod "1fa23eaa-697d-49d3-b507-51018b313a75" (UID: "1fa23eaa-697d-49d3-b507-51018b313a75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.171159 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-config" (OuterVolumeSpecName: "config") pod "0e0da7e0-c02a-49ff-8814-fafd537983b1" (UID: "0e0da7e0-c02a-49ff-8814-fafd537983b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.175621 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0da7e0-c02a-49ff-8814-fafd537983b1-kube-api-access-bnbgn" (OuterVolumeSpecName: "kube-api-access-bnbgn") pod "0e0da7e0-c02a-49ff-8814-fafd537983b1" (UID: "0e0da7e0-c02a-49ff-8814-fafd537983b1"). InnerVolumeSpecName "kube-api-access-bnbgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.176396 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa23eaa-697d-49d3-b507-51018b313a75-kube-api-access-75bfg" (OuterVolumeSpecName: "kube-api-access-75bfg") pod "1fa23eaa-697d-49d3-b507-51018b313a75" (UID: "1fa23eaa-697d-49d3-b507-51018b313a75"). InnerVolumeSpecName "kube-api-access-75bfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.270580 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.270617 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnbgn\" (UniqueName: \"kubernetes.io/projected/0e0da7e0-c02a-49ff-8814-fafd537983b1-kube-api-access-bnbgn\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.270631 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa23eaa-697d-49d3-b507-51018b313a75-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.270642 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0da7e0-c02a-49ff-8814-fafd537983b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.270653 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75bfg\" (UniqueName: \"kubernetes.io/projected/1fa23eaa-697d-49d3-b507-51018b313a75-kube-api-access-75bfg\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.325313 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd426cb0-522e-4c62-874e-a85119e82490","Type":"ContainerStarted","Data":"084390e48a42252a555b707e36b0ca7f50b3272b42c9270240aa90ec8ba4e8b3"} Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.328593 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"553a306b-ceeb-41b2-8b2c-c32dcd70639e","Type":"ContainerStarted","Data":"ba1df86643ae35114d53ee8b604d30e906dfb55981e49ceaf98bac06092e1cef"} Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.331415 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.331454 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-64h58" event={"ID":"0e0da7e0-c02a-49ff-8814-fafd537983b1","Type":"ContainerDied","Data":"f82914c2d478c5f74b5eb0fbfaae6fc889a0c43d617075be3f91aeae5a4d0ee8"} Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.337816 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5df97ea3-3281-4995-bf14-cb09bf09f39d","Type":"ContainerStarted","Data":"32085553b903c00bc98aea6fc554407cf0f02237e8cc505f51502b4eda09da3c"} Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.340537 4933 generic.go:334] "Generic (PLEG): container finished" podID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerID="60bb09918cfa0acd9c53ec060c987eaa645d316221e9eba07629d91dc767f855" exitCode=0 Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.340593 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" event={"ID":"e246afd2-cb0b-49f6-8144-6d4acba08825","Type":"ContainerDied","Data":"60bb09918cfa0acd9c53ec060c987eaa645d316221e9eba07629d91dc767f855"} Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.345483 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" event={"ID":"1fa23eaa-697d-49d3-b507-51018b313a75","Type":"ContainerDied","Data":"f4a88eba097198584ecc54c48e1273105d3ae5dc453de8f9afec83b3b0b1da80"} Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.345526 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tdkq4" Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.348039 4933 generic.go:334] "Generic (PLEG): container finished" podID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerID="9d30cec3a97a831843594c14863a1367de3b3e59a815b953ed80bbf97f8d7a3f" exitCode=0 Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.348067 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" event={"ID":"1042bb7b-3049-4492-a177-d1eb37d550a7","Type":"ContainerDied","Data":"9d30cec3a97a831843594c14863a1367de3b3e59a815b953ed80bbf97f8d7a3f"} Dec 02 16:13:13 crc kubenswrapper[4933]: E1202 16:13:13.393876 4933 mount_linux.go:282] Mount failed: exit status 32 Dec 02 16:13:13 crc kubenswrapper[4933]: Mounting command: mount Dec 02 16:13:13 crc kubenswrapper[4933]: Mounting arguments: --no-canonicalize -o bind /proc/4933/fd/30 /var/lib/kubelet/pods/e246afd2-cb0b-49f6-8144-6d4acba08825/volume-subpaths/dns-svc/dnsmasq-dns/1 Dec 02 16:13:13 crc kubenswrapper[4933]: Output: mount: /var/lib/kubelet/pods/e246afd2-cb0b-49f6-8144-6d4acba08825/volume-subpaths/dns-svc/dnsmasq-dns/1: mount(2) system call failed: No such file or directory. 
Dec 02 16:13:13 crc kubenswrapper[4933]: E1202 16:13:13.430240 4933 kubelet_pods.go:349] "Failed to prepare subPath for volumeMount of the container" err=<
Dec 02 16:13:13 crc kubenswrapper[4933]: error mounting /var/lib/kubelet/pods/e246afd2-cb0b-49f6-8144-6d4acba08825/volumes/kubernetes.io~configmap/dns-svc/..2025_12_02_16_12_56.3801737307/dns-svc: mount failed: exit status 32
Dec 02 16:13:13 crc kubenswrapper[4933]: Mounting command: mount
Dec 02 16:13:13 crc kubenswrapper[4933]: Mounting arguments: --no-canonicalize -o bind /proc/4933/fd/30 /var/lib/kubelet/pods/e246afd2-cb0b-49f6-8144-6d4acba08825/volume-subpaths/dns-svc/dnsmasq-dns/1
Dec 02 16:13:13 crc kubenswrapper[4933]: Output: mount: /var/lib/kubelet/pods/e246afd2-cb0b-49f6-8144-6d4acba08825/volume-subpaths/dns-svc/dnsmasq-dns/1: mount(2) system call failed: No such file or directory.
Dec 02 16:13:13 crc kubenswrapper[4933]: > containerName="dnsmasq-dns" volumeMountName="dns-svc"
Dec 02 16:13:13 crc kubenswrapper[4933]: E1202 16:13:13.430595 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2rh2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-4tp9z_openstack(e246afd2-cb0b-49f6-8144-6d4acba08825): CreateContainerConfigError: failed to prepare subPath for volumeMount \"dns-svc\" of container \"dnsmasq-dns\"" logger="UnhandledError"
Dec 02 16:13:13 crc kubenswrapper[4933]: E1202 16:13:13.432477 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerConfigError: \"failed to prepare subPath for volumeMount \\\"dns-svc\\\" of container \\\"dnsmasq-dns\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825"
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.447174 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-64h58"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.474384 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-64h58"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.537234 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tdkq4"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.571568 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tdkq4"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.582531 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.609587 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75644579b7-4b5d6"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.649967 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.657286 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.664279 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.846938 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8qb5"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.882707 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 02 16:13:13 crc kubenswrapper[4933]: I1202 16:13:13.984562 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 02 16:13:14 crc kubenswrapper[4933]: I1202 16:13:14.081247 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2z7kb"]
Dec 02 16:13:14 crc kubenswrapper[4933]: I1202 16:13:14.359399 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" event={"ID":"1042bb7b-3049-4492-a177-d1eb37d550a7","Type":"ContainerStarted","Data":"432b0ec0aa421862c4a079761af395b452a813f5b9ceb3d0c7b111354bdf1e7a"}
Dec 02 16:13:14 crc kubenswrapper[4933]: I1202 16:13:14.403101 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" podStartSLOduration=3.291837294 podStartE2EDuration="18.40307731s" podCreationTimestamp="2025-12-02 16:12:56 +0000 UTC" firstStartedPulling="2025-12-02 16:12:57.167498177 +0000 UTC m=+1240.418724880" lastFinishedPulling="2025-12-02 16:13:12.278738193 +0000 UTC m=+1255.529964896" observedRunningTime="2025-12-02 16:13:14.398163609 +0000 UTC m=+1257.649390312" watchObservedRunningTime="2025-12-02 16:13:14.40307731 +0000 UTC m=+1257.654304013"
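The startup-latency entry above is internally consistent and shows how its two durations relate: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image pull window (lastFinishedPulling minus firstStartedPulling). Worked through with the logged values:

    pull window        = 16:13:12.278738193 - 16:12:57.167498177 = 15.111240016 s
    podStartSLOduration = 18.40307731 s - 15.111240016 s          =  3.291837294 s

which matches the logged podStartSLOduration=3.291837294 exactly.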
Dec 02 16:13:14 crc kubenswrapper[4933]: W1202 16:13:14.968401 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34ddcddf_7552_4555_8013_3dc06fc2549a.slice/crio-d19a9e20f03edca737fda4c963ef5c9e2c4c9b2d26fe9e761707eaf847f8c726 WatchSource:0}: Error finding container d19a9e20f03edca737fda4c963ef5c9e2c4c9b2d26fe9e761707eaf847f8c726: Status 404 returned error can't find the container with id d19a9e20f03edca737fda4c963ef5c9e2c4c9b2d26fe9e761707eaf847f8c726
Dec 02 16:13:14 crc kubenswrapper[4933]: W1202 16:13:14.971882 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803ee8c3_0a9d_46c8_8069_1ebffde1429c.slice/crio-b3619fb9e1eedcc545c5fe8376be0f1111dbf0be41ffd241e4a5351c963aee0b WatchSource:0}: Error finding container b3619fb9e1eedcc545c5fe8376be0f1111dbf0be41ffd241e4a5351c963aee0b: Status 404 returned error can't find the container with id b3619fb9e1eedcc545c5fe8376be0f1111dbf0be41ffd241e4a5351c963aee0b
Dec 02 16:13:14 crc kubenswrapper[4933]: W1202 16:13:14.973946 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5b04559_f2ad_49dc_a280_626d6de841de.slice/crio-500086c1ff290e7835ca448b875edc735ed1b4df108d32d49607a2b1c4a7aa31 WatchSource:0}: Error finding container 500086c1ff290e7835ca448b875edc735ed1b4df108d32d49607a2b1c4a7aa31: Status 404 returned error can't find the container with id 500086c1ff290e7835ca448b875edc735ed1b4df108d32d49607a2b1c4a7aa31
Dec 02 16:13:14 crc kubenswrapper[4933]: W1202 16:13:14.980669 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09f9fb84_64a0_46a8_8551_4fcbf0e46050.slice/crio-a352a65a9c0d7ca501c24658a49c21d73ce43bd39a57d3da215596102005ca27 WatchSource:0}: Error finding container a352a65a9c0d7ca501c24658a49c21d73ce43bd39a57d3da215596102005ca27: Status 404 returned error can't find the container with id a352a65a9c0d7ca501c24658a49c21d73ce43bd39a57d3da215596102005ca27
Dec 02 16:13:14 crc kubenswrapper[4933]: W1202 16:13:14.987789 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a2d025_2245_46a6_82d1_228e920490a3.slice/crio-b17522335985fe2805f1a3399474ea21effe2738356e9dd4553bdf548a87665e WatchSource:0}: Error finding container b17522335985fe2805f1a3399474ea21effe2738356e9dd4553bdf548a87665e: Status 404 returned error can't find the container with id b17522335985fe2805f1a3399474ea21effe2738356e9dd4553bdf548a87665e
Dec 02 16:13:15 crc kubenswrapper[4933]: W1202 16:13:15.001698 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod662ba342_72a0_430b_b46d_d6f0f0eafd2b.slice/crio-2f32edae21ec8390ec3f7e000c6960e53a289fb61d4e2b1a2d656d31bad99f9f WatchSource:0}: Error finding container 2f32edae21ec8390ec3f7e000c6960e53a289fb61d4e2b1a2d656d31bad99f9f: Status 404 returned error can't find the container with id 2f32edae21ec8390ec3f7e000c6960e53a289fb61d4e2b1a2d656d31bad99f9f
Dec 02 16:13:15 crc kubenswrapper[4933]: W1202 16:13:15.050255 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a35aeec_161d_4ef9_a42b_7967c06c7249.slice/crio-2f99d41e99e73b451a765ba6a94ef131fd0a90103f9e9791b5d3d0c788dae235 WatchSource:0}: Error finding container 2f99d41e99e73b451a765ba6a94ef131fd0a90103f9e9791b5d3d0c788dae235: Status 404 returned error can't find the container with id 2f99d41e99e73b451a765ba6a94ef131fd0a90103f9e9791b5d3d0c788dae235
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.064146 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0da7e0-c02a-49ff-8814-fafd537983b1" path="/var/lib/kubelet/pods/0e0da7e0-c02a-49ff-8814-fafd537983b1/volumes"
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.064609 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa23eaa-697d-49d3-b507-51018b313a75" path="/var/lib/kubelet/pods/1fa23eaa-697d-49d3-b507-51018b313a75/volumes"
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.372196 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf" event={"ID":"d691d907-a29d-40ad-ad96-009e8a7d56e8","Type":"ContainerStarted","Data":"df1ee3e473cbb5816fbe174e694449abb58e2953e7ea7acba4a4b1ee80a427a0"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.373808 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"09f9fb84-64a0-46a8-8551-4fcbf0e46050","Type":"ContainerStarted","Data":"a352a65a9c0d7ca501c24658a49c21d73ce43bd39a57d3da215596102005ca27"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.375811 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5b04559-f2ad-49dc-a280-626d6de841de","Type":"ContainerStarted","Data":"500086c1ff290e7835ca448b875edc735ed1b4df108d32d49607a2b1c4a7aa31"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.377016 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32a2d025-2245-46a6-82d1-228e920490a3","Type":"ContainerStarted","Data":"b17522335985fe2805f1a3399474ea21effe2738356e9dd4553bdf548a87665e"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.378732 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"662ba342-72a0-430b-b46d-d6f0f0eafd2b","Type":"ContainerStarted","Data":"2f32edae21ec8390ec3f7e000c6960e53a289fb61d4e2b1a2d656d31bad99f9f"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.380766 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" event={"ID":"e246afd2-cb0b-49f6-8144-6d4acba08825","Type":"ContainerStarted","Data":"4ee8badfebcd14424fe101d599f29ee8cf5486a1eedaefd836ddb8ad8e591ada"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.381412 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z"
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.382863 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75644579b7-4b5d6" event={"ID":"8ae8bf53-f595-41aa-a854-2854ec134942","Type":"ContainerStarted","Data":"943eb47ec0403c3aad0df5d4bd90beb1eae0703cfd3f3216cba8b59222a644ec"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.382894 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75644579b7-4b5d6" event={"ID":"8ae8bf53-f595-41aa-a854-2854ec134942","Type":"ContainerStarted","Data":"f091080b751adb801d1f8993b65db2d3a4a93b323ff8f18b8744090f0a5a7a9a"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.385643 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2z7kb" event={"ID":"34ddcddf-7552-4555-8013-3dc06fc2549a","Type":"ContainerStarted","Data":"d19a9e20f03edca737fda4c963ef5c9e2c4c9b2d26fe9e761707eaf847f8c726"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.387048 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerStarted","Data":"b3619fb9e1eedcc545c5fe8376be0f1111dbf0be41ffd241e4a5351c963aee0b"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.390113 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8qb5" event={"ID":"8a35aeec-161d-4ef9-a42b-7967c06c7249","Type":"ContainerStarted","Data":"2f99d41e99e73b451a765ba6a94ef131fd0a90103f9e9791b5d3d0c788dae235"}
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.390253 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8"
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.406778 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" podStartSLOduration=5.075646241 podStartE2EDuration="20.406754119s" podCreationTimestamp="2025-12-02 16:12:55 +0000 UTC" firstStartedPulling="2025-12-02 16:12:56.914848054 +0000 UTC m=+1240.166074757" lastFinishedPulling="2025-12-02 16:13:12.245955932 +0000 UTC m=+1255.497182635" observedRunningTime="2025-12-02 16:13:15.396741063 +0000 UTC m=+1258.647967776" watchObservedRunningTime="2025-12-02 16:13:15.406754119 +0000 UTC m=+1258.657980832"
Dec 02 16:13:15 crc kubenswrapper[4933]: I1202 16:13:15.419566 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75644579b7-4b5d6" podStartSLOduration=12.419546289 podStartE2EDuration="12.419546289s" podCreationTimestamp="2025-12-02 16:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:13:15.414509925 +0000 UTC m=+1258.665736638" watchObservedRunningTime="2025-12-02 16:13:15.419546289 +0000 UTC m=+1258.670772992"
Dec 02 16:13:17 crc kubenswrapper[4933]: I1202 16:13:17.413219 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3cdda86-3d0d-486d-ae36-ab6792bff2ab","Type":"ContainerStarted","Data":"7da6f3952a39b50e00a2e909de5ee41531348ab856141537311ba7dea9e99461"}
Dec 02 16:13:17 crc kubenswrapper[4933]: I1202 16:13:17.417066 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5df97ea3-3281-4995-bf14-cb09bf09f39d","Type":"ContainerStarted","Data":"c0cbe56a7a65f4eb7aa06267f0be741d781f0a45fc009bb3d23073ffd1f9bfcb"}
Dec 02 16:13:19 crc kubenswrapper[4933]: I1202 16:13:19.241704 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-t8bvj" podUID="7d831995-fd00-455a-822e-82eb0cca6a33" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 02 16:13:21 crc kubenswrapper[4933]: I1202 16:13:21.358294 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z"
Dec 02 16:13:21 crc kubenswrapper[4933]: I1202 16:13:21.664998 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8"
Dec 02 16:13:21 crc kubenswrapper[4933]: I1202 16:13:21.736310 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4tp9z"]
Dec 02 16:13:22 crc kubenswrapper[4933]: I1202 16:13:22.302218 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="dnsmasq-dns" containerID="cri-o://4ee8badfebcd14424fe101d599f29ee8cf5486a1eedaefd836ddb8ad8e591ada" gracePeriod=10
Dec 02 16:13:23 crc kubenswrapper[4933]: I1202 16:13:23.312903 4933 generic.go:334] "Generic (PLEG): container finished" podID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerID="4ee8badfebcd14424fe101d599f29ee8cf5486a1eedaefd836ddb8ad8e591ada" exitCode=0
Dec 02 16:13:23 crc kubenswrapper[4933]: I1202 16:13:23.312949 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" event={"ID":"e246afd2-cb0b-49f6-8144-6d4acba08825","Type":"ContainerDied","Data":"4ee8badfebcd14424fe101d599f29ee8cf5486a1eedaefd836ddb8ad8e591ada"}
Dec 02 16:13:24 crc kubenswrapper[4933]: I1202 16:13:24.168804 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:24 crc kubenswrapper[4933]: I1202 16:13:24.169112 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:24 crc kubenswrapper[4933]: I1202 16:13:24.174629 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:24 crc kubenswrapper[4933]: I1202 16:13:24.323726 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75644579b7-4b5d6"
Dec 02 16:13:24 crc kubenswrapper[4933]: I1202 16:13:24.392014 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84778dd5c6-qprl6"]
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.486235 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified"
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.487399 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rw6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(cd426cb0-522e-4c62-874e-a85119e82490): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.489599 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="cd426cb0-522e-4c62-874e-a85119e82490"
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.494690 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified"
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.494846 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fh8qw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(553a306b-ceeb-41b2-8b2c-c32dcd70639e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.496613 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="553a306b-ceeb-41b2-8b2c-c32dcd70639e"
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.557137 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z"
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.650709 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rh2v\" (UniqueName: \"kubernetes.io/projected/e246afd2-cb0b-49f6-8144-6d4acba08825-kube-api-access-2rh2v\") pod \"e246afd2-cb0b-49f6-8144-6d4acba08825\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") "
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.650847 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-config\") pod \"e246afd2-cb0b-49f6-8144-6d4acba08825\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") "
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.650903 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-dns-svc\") pod \"e246afd2-cb0b-49f6-8144-6d4acba08825\" (UID: \"e246afd2-cb0b-49f6-8144-6d4acba08825\") "
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.657791 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e246afd2-cb0b-49f6-8144-6d4acba08825-kube-api-access-2rh2v" (OuterVolumeSpecName: "kube-api-access-2rh2v") pod "e246afd2-cb0b-49f6-8144-6d4acba08825" (UID: "e246afd2-cb0b-49f6-8144-6d4acba08825"). InnerVolumeSpecName "kube-api-access-2rh2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.702311 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e246afd2-cb0b-49f6-8144-6d4acba08825" (UID: "e246afd2-cb0b-49f6-8144-6d4acba08825"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.707931 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-config" (OuterVolumeSpecName: "config") pod "e246afd2-cb0b-49f6-8144-6d4acba08825" (UID: "e246afd2-cb0b-49f6-8144-6d4acba08825"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.753208 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-config\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.753234 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e246afd2-cb0b-49f6-8144-6d4acba08825-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.753244 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rh2v\" (UniqueName: \"kubernetes.io/projected/e246afd2-cb0b-49f6-8144-6d4acba08825-kube-api-access-2rh2v\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.809800 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified"
Dec 02 16:13:27 crc kubenswrapper[4933]: E1202 16:13:27.810041 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8h5f5hd5h574h9hf7hd4h5fbh68bhcbh5d8hf4h585h5dch97hb7h66ch669h56bhcfhd4h54ch589h6h5dfh67dh59ch56bh649h698h69h567q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wglg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(662ba342-72a0-430b-b46d-d6f0f0eafd2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 02 16:13:27 crc kubenswrapper[4933]: I1202 16:13:27.813134 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 16:13:28 crc kubenswrapper[4933]: I1202 16:13:28.356408 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" event={"ID":"e246afd2-cb0b-49f6-8144-6d4acba08825","Type":"ContainerDied","Data":"fe055abd92fee46f5f33ddaff98e45c1b4ef1276b7806f8cb51acf804d976fba"}
Dec 02 16:13:28 crc kubenswrapper[4933]: I1202 16:13:28.356468 4933 scope.go:117] "RemoveContainer" containerID="4ee8badfebcd14424fe101d599f29ee8cf5486a1eedaefd836ddb8ad8e591ada"
Dec 02 16:13:28 crc kubenswrapper[4933]: I1202 16:13:28.356481 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z"
Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.357802 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="cd426cb0-522e-4c62-874e-a85119e82490"
Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.358744 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="553a306b-ceeb-41b2-8b2c-c32dcd70639e"
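The ErrImagePull entries from the cancelled pulls at 16:13:27 turn into ImagePullBackOff here: the kubelet does not retry a failed pull immediately but delays each retry with exponential back-off, logging the "Error syncing pod, skipping" pairs above on every sync while the delay is still running. A standalone sketch of that policy; the 10 s base and 5 min cap are the commonly cited kubelet defaults, assumed here rather than read from this node's configuration:

```go
// Exponential image-pull back-off as a standalone sketch. The base and cap
// are assumptions (kubelet's widely documented defaults), not values taken
// from this cluster.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	delay := base
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d: wait %v before retrying pull\n", attempt, delay)
		delay *= 2 // double after each failure
		if delay > maxDelay {
			delay = maxDelay // never wait longer than the cap
		}
	}
}
```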
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.357802 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="cd426cb0-522e-4c62-874e-a85119e82490" Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.358744 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="553a306b-ceeb-41b2-8b2c-c32dcd70639e" Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.394176 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.394355 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66dh94h65dh559h676h66dhb6h5f8h67bh588h64dh547h588h8dhdfh649h65bh5c4h68h58chd4h678h676h696h5h556h65bh57hf8h57fh648h568q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfz54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,Timeout
Seconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-g8qb5_openstack(8a35aeec-161d-4ef9-a42b-7967c06c7249): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:13:28 crc kubenswrapper[4933]: E1202 16:13:28.395636 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-g8qb5" podUID="8a35aeec-161d-4ef9-a42b-7967c06c7249" Dec 02 16:13:28 crc kubenswrapper[4933]: I1202 16:13:28.429913 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4tp9z"] Dec 02 16:13:28 crc kubenswrapper[4933]: I1202 16:13:28.437371 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4tp9z"] Dec 02 16:13:29 crc kubenswrapper[4933]: I1202 16:13:29.071059 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" path="/var/lib/kubelet/pods/e246afd2-cb0b-49f6-8144-6d4acba08825/volumes" Dec 02 16:13:29 crc kubenswrapper[4933]: I1202 16:13:29.179144 4933 scope.go:117] "RemoveContainer" containerID="60bb09918cfa0acd9c53ec060c987eaa645d316221e9eba07629d91dc767f855" Dec 02 16:13:29 crc kubenswrapper[4933]: E1202 16:13:29.375375 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-g8qb5" podUID="8a35aeec-161d-4ef9-a42b-7967c06c7249" Dec 02 16:13:29 crc kubenswrapper[4933]: E1202 16:13:29.620409 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 16:13:29 crc kubenswrapper[4933]: E1202 16:13:29.620853 4933 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 16:13:29 crc kubenswrapper[4933]: E1202 16:13:29.621040 4933 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-js6p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(09f9fb84-64a0-46a8-8551-4fcbf0e46050): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 16:13:29 crc kubenswrapper[4933]: E1202 16:13:29.622891 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.384106 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5b04559-f2ad-49dc-a280-626d6de841de","Type":"ContainerStarted","Data":"8c1c341f6e78156bbfbde681d497764792b58a1884e206d0798dcd1a987166e5"} Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.386660 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32a2d025-2245-46a6-82d1-228e920490a3","Type":"ContainerStarted","Data":"82bfc9fe482dda75f66ac1fdc0bc093725f0868a8985d55b07504675d9fd7abb"} Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.386774 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.395321 4933 generic.go:334] "Generic (PLEG): container finished" 
podID="34ddcddf-7552-4555-8013-3dc06fc2549a" containerID="ad3fb0194a54ed5d482a90408738cbbcf2f6fd6c95642230c509ef6aea855c73" exitCode=0 Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.395425 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2z7kb" event={"ID":"34ddcddf-7552-4555-8013-3dc06fc2549a","Type":"ContainerDied","Data":"ad3fb0194a54ed5d482a90408738cbbcf2f6fd6c95642230c509ef6aea855c73"} Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.398016 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf" event={"ID":"d691d907-a29d-40ad-ad96-009e8a7d56e8","Type":"ContainerStarted","Data":"3c16771c6923295ab05abc49edbcd94b9e6f8e1aeb1afdf18e95e8ba3aabcfb6"} Dec 02 16:13:30 crc kubenswrapper[4933]: E1202 16:13:30.398705 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.407355 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.744601065 podStartE2EDuration="30.407331845s" podCreationTimestamp="2025-12-02 16:13:00 +0000 UTC" firstStartedPulling="2025-12-02 16:13:14.990793446 +0000 UTC m=+1258.242020149" lastFinishedPulling="2025-12-02 16:13:29.653524226 +0000 UTC m=+1272.904750929" observedRunningTime="2025-12-02 16:13:30.404191191 +0000 UTC m=+1273.655417894" watchObservedRunningTime="2025-12-02 16:13:30.407331845 +0000 UTC m=+1273.658558558" Dec 02 16:13:30 crc kubenswrapper[4933]: I1202 16:13:30.464207 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zdcrf" podStartSLOduration=12.85016036 podStartE2EDuration="27.464184415s" podCreationTimestamp="2025-12-02 16:13:03 +0000 UTC" firstStartedPulling="2025-12-02 16:13:15.000440363 +0000 UTC m=+1258.251667086" lastFinishedPulling="2025-12-02 16:13:29.614464438 +0000 UTC m=+1272.865691141" observedRunningTime="2025-12-02 16:13:30.456679276 +0000 UTC m=+1273.707905979" watchObservedRunningTime="2025-12-02 16:13:30.464184415 +0000 UTC m=+1273.715411118" Dec 02 16:13:31 crc kubenswrapper[4933]: I1202 16:13:31.358365 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-4tp9z" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 02 16:13:32 crc kubenswrapper[4933]: E1202 16:13:32.110300 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="662ba342-72a0-430b-b46d-d6f0f0eafd2b" Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.433918 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"662ba342-72a0-430b-b46d-d6f0f0eafd2b","Type":"ContainerStarted","Data":"2ee562246ddab66ffb7dc71306e3d9739f67c9cddcf8627582603e0e2fc85c38"} Dec 02 16:13:32 crc kubenswrapper[4933]: E1202 16:13:32.436599 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="662ba342-72a0-430b-b46d-d6f0f0eafd2b" Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.438459 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2z7kb" event={"ID":"34ddcddf-7552-4555-8013-3dc06fc2549a","Type":"ContainerStarted","Data":"0a8c6efae8e18901708f5a83779f18c4446e7fd8d455e0ce53530a1fc050dc42"} Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.438523 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.438542 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.438552 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2z7kb" event={"ID":"34ddcddf-7552-4555-8013-3dc06fc2549a","Type":"ContainerStarted","Data":"6840b87d748d5fe07ba1daa02f427d3ae1b357c235ac21cd68a0986df5d6ee7d"} Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.441635 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerStarted","Data":"807b2330f3cb44203c7f511d9f2b7abbe69f7ffc78640e9df411d576e748c6d3"} Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.445197 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5b04559-f2ad-49dc-a280-626d6de841de","Type":"ContainerStarted","Data":"50ab704f30d54d80f17c0efb56dd2a589517bdf2f74df4a6137f1f2a1f126013"} Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.515226 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2z7kb" podStartSLOduration=13.405473809 podStartE2EDuration="27.515207934s" podCreationTimestamp="2025-12-02 16:13:05 +0000 UTC" firstStartedPulling="2025-12-02 16:13:14.970476247 +0000 UTC m=+1258.221702950" lastFinishedPulling="2025-12-02 16:13:29.080210372 +0000 UTC m=+1272.331437075" observedRunningTime="2025-12-02 16:13:32.495988223 +0000 UTC m=+1275.747214926" watchObservedRunningTime="2025-12-02 16:13:32.515207934 +0000 UTC m=+1275.766434637" Dec 02 16:13:32 crc kubenswrapper[4933]: I1202 16:13:32.518146 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.959844815 podStartE2EDuration="24.518134112s" podCreationTimestamp="2025-12-02 16:13:08 +0000 UTC" firstStartedPulling="2025-12-02 16:13:14.975742246 +0000 UTC m=+1258.226968949" lastFinishedPulling="2025-12-02 16:13:31.534031533 +0000 UTC m=+1274.785258246" observedRunningTime="2025-12-02 16:13:32.513764696 +0000 UTC m=+1275.764991409" watchObservedRunningTime="2025-12-02 16:13:32.518134112 +0000 UTC m=+1275.769360815" Dec 02 16:13:33 crc kubenswrapper[4933]: E1202 16:13:33.456798 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="662ba342-72a0-430b-b46d-d6f0f0eafd2b" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.100865 4933 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.141954 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.465957 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.509997 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.804418 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fxt8c"] Dec 02 16:13:34 crc kubenswrapper[4933]: E1202 16:13:34.804913 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="init" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.804930 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="init" Dec 02 16:13:34 crc kubenswrapper[4933]: E1202 16:13:34.804962 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="dnsmasq-dns" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.804968 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="dnsmasq-dns" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.805153 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e246afd2-cb0b-49f6-8144-6d4acba08825" containerName="dnsmasq-dns" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.806380 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.815277 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.818507 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fxt8c"] Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.827534 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cvdrx"] Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.830075 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.832297 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.853530 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cvdrx"] Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904449 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904527 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-combined-ca-bundle\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904555 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv2f9\" (UniqueName: \"kubernetes.io/projected/10c5e5d4-0a18-43ed-8fd0-a386264a6033-kube-api-access-gv2f9\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904571 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-config\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904653 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwsz8\" (UniqueName: \"kubernetes.io/projected/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-kube-api-access-mwsz8\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904675 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904799 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-ovn-rundir\") 
pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904901 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-ovs-rundir\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:34 crc kubenswrapper[4933]: I1202 16:13:34.904969 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-config\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.006695 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-combined-ca-bundle\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.007789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv2f9\" (UniqueName: \"kubernetes.io/projected/10c5e5d4-0a18-43ed-8fd0-a386264a6033-kube-api-access-gv2f9\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.007850 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.007894 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-config\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.007986 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwsz8\" (UniqueName: \"kubernetes.io/projected/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-kube-api-access-mwsz8\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008014 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008112 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-ovn-rundir\") pod 
\"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008144 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-ovs-rundir\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008187 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-config\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008453 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-ovn-rundir\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008719 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008787 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-config\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.008805 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-ovs-rundir\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.009086 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.009294 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-config\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.012518 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.025653 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-combined-ca-bundle\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.031739 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv2f9\" (UniqueName: \"kubernetes.io/projected/10c5e5d4-0a18-43ed-8fd0-a386264a6033-kube-api-access-gv2f9\") pod \"dnsmasq-dns-6bc7876d45-fxt8c\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.033335 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwsz8\" (UniqueName: \"kubernetes.io/projected/b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2-kube-api-access-mwsz8\") pod \"ovn-controller-metrics-cvdrx\" (UID: \"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2\") " pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.092322 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fxt8c"] Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.094627 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.124448 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-8qkcx"] Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.132376 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.135075 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.157640 4933 util.go:30] "No sandbox for pod can be found. 
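The records above trace kubelet's volume reconciliation pipeline for the two pods created at 16:13:34: every volume passes through VerifyControllerAttachedVolume (reconciler_common.go:245), then MountVolume (reconciler_common.go:218), then MountVolume.SetUp succeeded (operation_generator.go:637), across the configmap, secret, projected, and host-path plugins. A minimal triage sketch that tallies those three phases per pod from a journal on stdin; the helper and its regex are mine, not kubelet code, and it assumes one record per line as journalctl normally emits:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the three reconciler phases seen in the records above, capturing
// phase, volume name, and owning pod. The quoted fields carry escaped quotes
// (\") inside the klog message, hence the optional backslash in the pattern.
var volRe = regexp.MustCompile(
	`(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded) for volume \\?"([^"\\]+)\\?".*? pod="([^"]+)"`)

func main() {
	seen := map[string]map[string]int{} // pod -> volume -> phases observed
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // journal lines can be very long
	for sc.Scan() {
		for _, m := range volRe.FindAllStringSubmatch(sc.Text(), -1) {
			pod, vol := m[3], m[2]
			if seen[pod] == nil {
				seen[pod] = map[string]int{}
			}
			seen[pod][vol]++
		}
	}
	for pod, vols := range seen {
		fmt.Println(pod)
		for vol, n := range vols {
			fmt.Printf("  %-28s %d/3 phases\n", vol, n)
		}
	}
}
```

Fed the journal above, each volume of dnsmasq-dns-6bc7876d45-fxt8c and ovn-controller-metrics-cvdrx should report 3/3, which is the healthy pattern; a volume stuck at 1/3 or 2/3 is the thing to grep for.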
Need to start a new one" pod="openstack/ovn-controller-metrics-cvdrx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.175117 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8qkcx"] Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.222012 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjw2\" (UniqueName: \"kubernetes.io/projected/62a770e8-010d-48cf-aef1-9926e6ed868d-kube-api-access-9mjw2\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.222209 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.222241 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-config\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.222264 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.222316 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-dns-svc\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.323977 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.324334 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-config\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.324371 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.324439 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-dns-svc\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.324473 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjw2\" (UniqueName: \"kubernetes.io/projected/62a770e8-010d-48cf-aef1-9926e6ed868d-kube-api-access-9mjw2\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.326473 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.328331 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.328336 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-config\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.329584 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-dns-svc\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.344929 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjw2\" (UniqueName: \"kubernetes.io/projected/62a770e8-010d-48cf-aef1-9926e6ed868d-kube-api-access-9mjw2\") pod \"dnsmasq-dns-8554648995-8qkcx\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.607853 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.719560 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fxt8c"] Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.778490 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 16:13:35 crc kubenswrapper[4933]: I1202 16:13:35.841438 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cvdrx"] Dec 02 16:13:37 crc kubenswrapper[4933]: I1202 16:13:37.498479 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cvdrx" event={"ID":"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2","Type":"ContainerStarted","Data":"0f775ee9934b732ed0836ada7f14356d1e4a0b8b61239dcad880c422d71a2e63"} Dec 02 16:13:37 crc kubenswrapper[4933]: I1202 16:13:37.500897 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" event={"ID":"10c5e5d4-0a18-43ed-8fd0-a386264a6033","Type":"ContainerStarted","Data":"68fe22fdc1112232856b27ad68a8a076a8e733dabee6757ee5c1f36b73006656"} Dec 02 16:13:37 crc kubenswrapper[4933]: I1202 16:13:37.731676 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8qkcx"] Dec 02 16:13:37 crc kubenswrapper[4933]: W1202 16:13:37.739240 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62a770e8_010d_48cf_aef1_9926e6ed868d.slice/crio-a7cbb190a8dc895b4be71cbc456eb5b1fa29ccecc57c992a600a588e45810449 WatchSource:0}: Error finding container a7cbb190a8dc895b4be71cbc456eb5b1fa29ccecc57c992a600a588e45810449: Status 404 returned error can't find the container with id a7cbb190a8dc895b4be71cbc456eb5b1fa29ccecc57c992a600a588e45810449 Dec 02 16:13:38 crc kubenswrapper[4933]: I1202 16:13:38.516181 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8qkcx" event={"ID":"62a770e8-010d-48cf-aef1-9926e6ed868d","Type":"ContainerStarted","Data":"a7cbb190a8dc895b4be71cbc456eb5b1fa29ccecc57c992a600a588e45810449"} Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.527142 4933 generic.go:334] "Generic (PLEG): container finished" podID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerID="cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463" exitCode=0 Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.527206 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8qkcx" event={"ID":"62a770e8-010d-48cf-aef1-9926e6ed868d","Type":"ContainerDied","Data":"cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463"} Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.529548 4933 generic.go:334] "Generic (PLEG): container finished" podID="10c5e5d4-0a18-43ed-8fd0-a386264a6033" containerID="9ee4c51017bd8f5659c3e8c7b2a20db37894536e2f904e5c1c4d9169ee74c25a" exitCode=0 Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.529596 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" event={"ID":"10c5e5d4-0a18-43ed-8fd0-a386264a6033","Type":"ContainerDied","Data":"9ee4c51017bd8f5659c3e8c7b2a20db37894536e2f904e5c1c4d9169ee74c25a"} Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.532393 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cvdrx" 
event={"ID":"b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2","Type":"ContainerStarted","Data":"7eecec8d3475557e9ca5098af2983e8ec0c2a407e07dde2693bc5a208564d6e6"} Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.576185 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cvdrx" podStartSLOduration=5.576168123 podStartE2EDuration="5.576168123s" podCreationTimestamp="2025-12-02 16:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:13:39.565642153 +0000 UTC m=+1282.816868856" watchObservedRunningTime="2025-12-02 16:13:39.576168123 +0000 UTC m=+1282.827394826" Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.887182 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.932739 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-dns-svc\") pod \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.932784 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv2f9\" (UniqueName: \"kubernetes.io/projected/10c5e5d4-0a18-43ed-8fd0-a386264a6033-kube-api-access-gv2f9\") pod \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.932807 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-ovsdbserver-sb\") pod \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.932988 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-config\") pod \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\" (UID: \"10c5e5d4-0a18-43ed-8fd0-a386264a6033\") " Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.937308 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c5e5d4-0a18-43ed-8fd0-a386264a6033-kube-api-access-gv2f9" (OuterVolumeSpecName: "kube-api-access-gv2f9") pod "10c5e5d4-0a18-43ed-8fd0-a386264a6033" (UID: "10c5e5d4-0a18-43ed-8fd0-a386264a6033"). InnerVolumeSpecName "kube-api-access-gv2f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.959012 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-config" (OuterVolumeSpecName: "config") pod "10c5e5d4-0a18-43ed-8fd0-a386264a6033" (UID: "10c5e5d4-0a18-43ed-8fd0-a386264a6033"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.959150 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10c5e5d4-0a18-43ed-8fd0-a386264a6033" (UID: "10c5e5d4-0a18-43ed-8fd0-a386264a6033"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:39 crc kubenswrapper[4933]: I1202 16:13:39.968213 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10c5e5d4-0a18-43ed-8fd0-a386264a6033" (UID: "10c5e5d4-0a18-43ed-8fd0-a386264a6033"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.034907 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.034943 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.034953 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv2f9\" (UniqueName: \"kubernetes.io/projected/10c5e5d4-0a18-43ed-8fd0-a386264a6033-kube-api-access-gv2f9\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.034966 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10c5e5d4-0a18-43ed-8fd0-a386264a6033-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.548370 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" event={"ID":"10c5e5d4-0a18-43ed-8fd0-a386264a6033","Type":"ContainerDied","Data":"68fe22fdc1112232856b27ad68a8a076a8e733dabee6757ee5c1f36b73006656"} Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.548689 4933 scope.go:117] "RemoveContainer" containerID="9ee4c51017bd8f5659c3e8c7b2a20db37894536e2f904e5c1c4d9169ee74c25a" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.548411 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fxt8c" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.552551 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"553a306b-ceeb-41b2-8b2c-c32dcd70639e","Type":"ContainerStarted","Data":"ec58157f081b9ef4630a44882171b6b5835a80908f158d4bdc843455dbe814fe"} Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.554481 4933 generic.go:334] "Generic (PLEG): container finished" podID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerID="807b2330f3cb44203c7f511d9f2b7abbe69f7ffc78640e9df411d576e748c6d3" exitCode=0 Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.554580 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerDied","Data":"807b2330f3cb44203c7f511d9f2b7abbe69f7ffc78640e9df411d576e748c6d3"} Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.558115 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8qkcx" event={"ID":"62a770e8-010d-48cf-aef1-9926e6ed868d","Type":"ContainerStarted","Data":"9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89"} Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.558274 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.619031 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-8qkcx" podStartSLOduration=5.619004402 podStartE2EDuration="5.619004402s" podCreationTimestamp="2025-12-02 16:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:13:40.607211439 +0000 UTC m=+1283.858438152" watchObservedRunningTime="2025-12-02 16:13:40.619004402 +0000 UTC m=+1283.870231125" Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.695408 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fxt8c"] Dec 02 16:13:40 crc kubenswrapper[4933]: I1202 16:13:40.702934 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fxt8c"] Dec 02 16:13:41 crc kubenswrapper[4933]: I1202 16:13:41.069865 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c5e5d4-0a18-43ed-8fd0-a386264a6033" path="/var/lib/kubelet/pods/10c5e5d4-0a18-43ed-8fd0-a386264a6033/volumes" Dec 02 16:13:42 crc kubenswrapper[4933]: I1202 16:13:42.582307 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd426cb0-522e-4c62-874e-a85119e82490","Type":"ContainerStarted","Data":"ce237f204414d94a15d1e1174c18cc589f2b40b9be66d7701c711e09638d8f2a"} Dec 02 16:13:42 crc kubenswrapper[4933]: I1202 16:13:42.974098 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8qkcx"] Dec 02 16:13:42 crc kubenswrapper[4933]: I1202 16:13:42.974319 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-8qkcx" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerName="dnsmasq-dns" containerID="cri-o://9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89" gracePeriod=10 Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.013397 4933 kubelet.go:2421] "SyncLoop ADD" source="api" 
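The "Observed pod startup duration" records are plain subtraction: podStartSLOduration is the observed-running time minus podCreationTimestamp, and the zero-valued firstStartedPulling/lastFinishedPulling ("0001-01-01 ...") mean no image pull contributed, which is why podStartE2EDuration equals the SLO figure for dnsmasq-dns-8554648995-8qkcx (contrast openstack-cell1-galera-0 just below, where ~27s of pulling separates 18.6s from 45.7s). Checking the arithmetic with timestamps copied from the record; the layout string is my assumption, matching Go's default time.Time formatting:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's default time.Time rendering, which is how kubelet prints these
	// fields in the record above (the layout choice is an assumption).
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, _ := time.Parse(layout, "2025-12-02 16:13:35 +0000 UTC")
	running, _ := time.Parse(layout, "2025-12-02 16:13:40.619004402 +0000 UTC")

	fmt.Println(running.Sub(created)) // 5.619004402s, the reported SLO duration
}
```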
pods=["openstack/dnsmasq-dns-b8fbc5445-5fxhc"] Dec 02 16:13:43 crc kubenswrapper[4933]: E1202 16:13:43.013811 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c5e5d4-0a18-43ed-8fd0-a386264a6033" containerName="init" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.013892 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c5e5d4-0a18-43ed-8fd0-a386264a6033" containerName="init" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.014135 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c5e5d4-0a18-43ed-8fd0-a386264a6033" containerName="init" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.015247 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.053856 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5fxhc"] Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.094729 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vtfg\" (UniqueName: \"kubernetes.io/projected/36ad0698-417e-43da-b2da-8d4752f57abf-kube-api-access-5vtfg\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.094793 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.094925 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.094983 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-config\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.095004 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.196213 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-config\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.196269 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.196338 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vtfg\" (UniqueName: \"kubernetes.io/projected/36ad0698-417e-43da-b2da-8d4752f57abf-kube-api-access-5vtfg\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.196376 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.196469 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.198260 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.198489 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-config\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.199272 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.199409 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.225748 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vtfg\" (UniqueName: \"kubernetes.io/projected/36ad0698-417e-43da-b2da-8d4752f57abf-kube-api-access-5vtfg\") pod \"dnsmasq-dns-b8fbc5445-5fxhc\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.382814 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.499480 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.602421 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-dns-svc\") pod \"62a770e8-010d-48cf-aef1-9926e6ed868d\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.602481 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-config\") pod \"62a770e8-010d-48cf-aef1-9926e6ed868d\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.602572 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mjw2\" (UniqueName: \"kubernetes.io/projected/62a770e8-010d-48cf-aef1-9926e6ed868d-kube-api-access-9mjw2\") pod \"62a770e8-010d-48cf-aef1-9926e6ed868d\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.602721 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-nb\") pod \"62a770e8-010d-48cf-aef1-9926e6ed868d\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.602775 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-sb\") pod \"62a770e8-010d-48cf-aef1-9926e6ed868d\" (UID: \"62a770e8-010d-48cf-aef1-9926e6ed868d\") " Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.613119 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a770e8-010d-48cf-aef1-9926e6ed868d-kube-api-access-9mjw2" (OuterVolumeSpecName: "kube-api-access-9mjw2") pod "62a770e8-010d-48cf-aef1-9926e6ed868d" (UID: "62a770e8-010d-48cf-aef1-9926e6ed868d"). InnerVolumeSpecName "kube-api-access-9mjw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.617182 4933 generic.go:334] "Generic (PLEG): container finished" podID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerID="9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89" exitCode=0 Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.617254 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8qkcx" event={"ID":"62a770e8-010d-48cf-aef1-9926e6ed868d","Type":"ContainerDied","Data":"9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89"} Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.617283 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8qkcx" event={"ID":"62a770e8-010d-48cf-aef1-9926e6ed868d","Type":"ContainerDied","Data":"a7cbb190a8dc895b4be71cbc456eb5b1fa29ccecc57c992a600a588e45810449"} Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.617298 4933 scope.go:117] "RemoveContainer" containerID="9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.617428 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8qkcx" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.632727 4933 generic.go:334] "Generic (PLEG): container finished" podID="553a306b-ceeb-41b2-8b2c-c32dcd70639e" containerID="ec58157f081b9ef4630a44882171b6b5835a80908f158d4bdc843455dbe814fe" exitCode=0 Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.632766 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"553a306b-ceeb-41b2-8b2c-c32dcd70639e","Type":"ContainerDied","Data":"ec58157f081b9ef4630a44882171b6b5835a80908f158d4bdc843455dbe814fe"} Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.686849 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62a770e8-010d-48cf-aef1-9926e6ed868d" (UID: "62a770e8-010d-48cf-aef1-9926e6ed868d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.687271 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62a770e8-010d-48cf-aef1-9926e6ed868d" (UID: "62a770e8-010d-48cf-aef1-9926e6ed868d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.696799 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-config" (OuterVolumeSpecName: "config") pod "62a770e8-010d-48cf-aef1-9926e6ed868d" (UID: "62a770e8-010d-48cf-aef1-9926e6ed868d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.703177 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62a770e8-010d-48cf-aef1-9926e6ed868d" (UID: "62a770e8-010d-48cf-aef1-9926e6ed868d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.710218 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mjw2\" (UniqueName: \"kubernetes.io/projected/62a770e8-010d-48cf-aef1-9926e6ed868d-kube-api-access-9mjw2\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.710239 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.710247 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.710257 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.710265 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a770e8-010d-48cf-aef1-9926e6ed868d-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.728895 4933 scope.go:117] "RemoveContainer" containerID="cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.780602 4933 scope.go:117] "RemoveContainer" containerID="9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89" Dec 02 16:13:43 crc kubenswrapper[4933]: E1202 16:13:43.781200 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89\": container with ID starting with 9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89 not found: ID does not exist" containerID="9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.781237 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89"} err="failed to get container status \"9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89\": rpc error: code = NotFound desc = could not find container \"9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89\": container with ID starting with 9afa6cbff396898f3ddc4d0620a43dfc896dc0905e53e3b016b8107e104afb89 not found: ID does not exist" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.781266 4933 scope.go:117] "RemoveContainer" containerID="cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463" Dec 02 16:13:43 crc kubenswrapper[4933]: E1202 16:13:43.787075 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463\": container with ID starting with cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463 not found: ID does not exist" containerID="cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.787129 4933 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463"} err="failed to get container status \"cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463\": rpc error: code = NotFound desc = could not find container \"cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463\": container with ID starting with cbf1baf8661593d1530f326464c83275da51b642d43336ae7752bb49853e3463 not found: ID does not exist" Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.927679 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5fxhc"] Dec 02 16:13:43 crc kubenswrapper[4933]: W1202 16:13:43.935485 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36ad0698_417e_43da_b2da_8d4752f57abf.slice/crio-cfc8f745c2bdcefeb4f0a535bba7977c352b6f9375c3376a0b90a0e9da59fc70 WatchSource:0}: Error finding container cfc8f745c2bdcefeb4f0a535bba7977c352b6f9375c3376a0b90a0e9da59fc70: Status 404 returned error can't find the container with id cfc8f745c2bdcefeb4f0a535bba7977c352b6f9375c3376a0b90a0e9da59fc70 Dec 02 16:13:43 crc kubenswrapper[4933]: I1202 16:13:43.996694 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8qkcx"] Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.007417 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8qkcx"] Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.104130 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.104639 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerName="init" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.104665 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerName="init" Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.104697 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerName="dnsmasq-dns" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.104706 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerName="dnsmasq-dns" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.104950 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" containerName="dnsmasq-dns" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.112320 4933 util.go:30] "No sandbox for pod can be found. 
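The RemoveContainer / NotFound pairs above are a benign race, not a failure: scope.go:117 removes the dead containers, a follow-up status query reaches CRI-O after the container is already gone, and the runtime answers with gRPC NotFound, so pod_container_deletor logs "DeleteContainer returned error" even though the cleanup had already succeeded. A sketch of the usual client-side classification of such CRI errors; the grpc status/codes packages are real, the surrounding function is hypothetical:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI call failed only because the container
// no longer exists -- the situation in the records above, where removal
// completed before the status re-query.
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Simulate the runtime reply seen in the log.
	err := status.Errorf(codes.NotFound,
		"could not find container %q: ID does not exist", "9afa6cbf...")
	if alreadyGone(err) {
		fmt.Println("container already removed; nothing to do")
	}
}
```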
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.116243 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-c6mh5" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.116498 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.116510 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.116628 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.146888 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.231918 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cf325ab8-af91-4009-9e8b-a299db2234da-lock\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.232452 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.232518 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cf325ab8-af91-4009-9e8b-a299db2234da-cache\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.232583 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.232617 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48pd8\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-kube-api-access-48pd8\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.334619 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48pd8\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-kube-api-access-48pd8\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.334695 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cf325ab8-af91-4009-9e8b-a299db2234da-lock\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.334772 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.334855 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cf325ab8-af91-4009-9e8b-a299db2234da-cache\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.334912 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.335081 4933 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.335111 4933 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.335168 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift podName:cf325ab8-af91-4009-9e8b-a299db2234da nodeName:}" failed. No retries permitted until 2025-12-02 16:13:44.835150645 +0000 UTC m=+1288.086377348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift") pod "swift-storage-0" (UID: "cf325ab8-af91-4009-9e8b-a299db2234da") : configmap "swift-ring-files" not found Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.335208 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.335305 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cf325ab8-af91-4009-9e8b-a299db2234da-lock\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.335336 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cf325ab8-af91-4009-9e8b-a299db2234da-cache\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.361853 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.374721 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48pd8\" (UniqueName: 
\"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-kube-api-access-48pd8\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.646530 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"553a306b-ceeb-41b2-8b2c-c32dcd70639e","Type":"ContainerStarted","Data":"31466a253b23b8d1c5a91cbcc9dfab88826bfc2eea6b891822dc404e4a8f4fe9"} Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.649111 4933 generic.go:334] "Generic (PLEG): container finished" podID="36ad0698-417e-43da-b2da-8d4752f57abf" containerID="5b185b4ccbd021ccf09063083956147391757c9423cb7ebad08d581d2737a815" exitCode=0 Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.649189 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" event={"ID":"36ad0698-417e-43da-b2da-8d4752f57abf","Type":"ContainerDied","Data":"5b185b4ccbd021ccf09063083956147391757c9423cb7ebad08d581d2737a815"} Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.649225 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" event={"ID":"36ad0698-417e-43da-b2da-8d4752f57abf","Type":"ContainerStarted","Data":"cfc8f745c2bdcefeb4f0a535bba7977c352b6f9375c3376a0b90a0e9da59fc70"} Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.668519 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.616938666 podStartE2EDuration="45.668504193s" podCreationTimestamp="2025-12-02 16:12:59 +0000 UTC" firstStartedPulling="2025-12-02 16:13:12.683082927 +0000 UTC m=+1255.934309630" lastFinishedPulling="2025-12-02 16:13:39.734648454 +0000 UTC m=+1282.985875157" observedRunningTime="2025-12-02 16:13:44.66199172 +0000 UTC m=+1287.913218423" watchObservedRunningTime="2025-12-02 16:13:44.668504193 +0000 UTC m=+1287.919730896" Dec 02 16:13:44 crc kubenswrapper[4933]: I1202 16:13:44.845549 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.845939 4933 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.845971 4933 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 16:13:44 crc kubenswrapper[4933]: E1202 16:13:44.846037 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift podName:cf325ab8-af91-4009-9e8b-a299db2234da nodeName:}" failed. No retries permitted until 2025-12-02 16:13:45.84601482 +0000 UTC m=+1289.097241533 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift") pod "swift-storage-0" (UID: "cf325ab8-af91-4009-9e8b-a299db2234da") : configmap "swift-ring-files" not found Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.068476 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a770e8-010d-48cf-aef1-9926e6ed868d" path="/var/lib/kubelet/pods/62a770e8-010d-48cf-aef1-9926e6ed868d/volumes" Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.681152 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" event={"ID":"36ad0698-417e-43da-b2da-8d4752f57abf","Type":"ContainerStarted","Data":"f815f4b942dceacd673181b01a8d49d8c21f7e4c6a9171fece2cfebed934c41f"} Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.681727 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.684641 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8qb5" event={"ID":"8a35aeec-161d-4ef9-a42b-7967c06c7249","Type":"ContainerStarted","Data":"8541c3601f3bdc9c38715335ad56a9be91791a1017360c36b5b1ee90ecb6a1c9"} Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.685103 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-g8qb5" Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.711226 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" podStartSLOduration=3.711208449 podStartE2EDuration="3.711208449s" podCreationTimestamp="2025-12-02 16:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:13:45.702876568 +0000 UTC m=+1288.954103301" watchObservedRunningTime="2025-12-02 16:13:45.711208449 +0000 UTC m=+1288.962435172" Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.742971 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-g8qb5" podStartSLOduration=10.998677337 podStartE2EDuration="40.742932892s" podCreationTimestamp="2025-12-02 16:13:05 +0000 UTC" firstStartedPulling="2025-12-02 16:13:15.064066893 +0000 UTC m=+1258.315293596" lastFinishedPulling="2025-12-02 16:13:44.808322448 +0000 UTC m=+1288.059549151" observedRunningTime="2025-12-02 16:13:45.720118656 +0000 UTC m=+1288.971345369" watchObservedRunningTime="2025-12-02 16:13:45.742932892 +0000 UTC m=+1288.994159605" Dec 02 16:13:45 crc kubenswrapper[4933]: I1202 16:13:45.865676 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:45 crc kubenswrapper[4933]: E1202 16:13:45.865945 4933 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 16:13:45 crc kubenswrapper[4933]: E1202 16:13:45.865974 4933 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 16:13:45 crc kubenswrapper[4933]: E1202 16:13:45.866034 4933 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift podName:cf325ab8-af91-4009-9e8b-a299db2234da nodeName:}" failed. No retries permitted until 2025-12-02 16:13:47.866014323 +0000 UTC m=+1291.117241026 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift") pod "swift-storage-0" (UID: "cf325ab8-af91-4009-9e8b-a299db2234da") : configmap "swift-ring-files" not found Dec 02 16:13:46 crc kubenswrapper[4933]: I1202 16:13:46.693726 4933 generic.go:334] "Generic (PLEG): container finished" podID="cd426cb0-522e-4c62-874e-a85119e82490" containerID="ce237f204414d94a15d1e1174c18cc589f2b40b9be66d7701c711e09638d8f2a" exitCode=0 Dec 02 16:13:46 crc kubenswrapper[4933]: I1202 16:13:46.693931 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd426cb0-522e-4c62-874e-a85119e82490","Type":"ContainerDied","Data":"ce237f204414d94a15d1e1174c18cc589f2b40b9be66d7701c711e09638d8f2a"} Dec 02 16:13:47 crc kubenswrapper[4933]: I1202 16:13:47.910554 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:13:47 crc kubenswrapper[4933]: E1202 16:13:47.910800 4933 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 16:13:47 crc kubenswrapper[4933]: E1202 16:13:47.911041 4933 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 16:13:47 crc kubenswrapper[4933]: E1202 16:13:47.911108 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift podName:cf325ab8-af91-4009-9e8b-a299db2234da nodeName:}" failed. No retries permitted until 2025-12-02 16:13:51.911087663 +0000 UTC m=+1295.162314356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift") pod "swift-storage-0" (UID: "cf325ab8-af91-4009-9e8b-a299db2234da") : configmap "swift-ring-files" not found Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.070235 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vwmtj"] Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.071785 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.077220 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.077603 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.078174 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.084612 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vwmtj"]
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.115914 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-dispersionconf\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.115972 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/591c4cbc-2470-449b-9046-76b4c6543cb9-etc-swift\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.115995 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-swiftconf\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.116116 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-scripts\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.116165 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9st\" (UniqueName: \"kubernetes.io/projected/591c4cbc-2470-449b-9046-76b4c6543cb9-kube-api-access-5q9st\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.116198 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-ring-data-devices\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.116256 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-combined-ca-bundle\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218006 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-scripts\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218093 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9st\" (UniqueName: \"kubernetes.io/projected/591c4cbc-2470-449b-9046-76b4c6543cb9-kube-api-access-5q9st\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218147 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-ring-data-devices\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218197 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-combined-ca-bundle\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218288 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-dispersionconf\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218346 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/591c4cbc-2470-449b-9046-76b4c6543cb9-etc-swift\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-swiftconf\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.218950 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/591c4cbc-2470-449b-9046-76b4c6543cb9-etc-swift\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.219347 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-ring-data-devices\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.219968 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-scripts\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.223649 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-dispersionconf\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.232957 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-combined-ca-bundle\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.233560 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-swiftconf\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.234537 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9st\" (UniqueName: \"kubernetes.io/projected/591c4cbc-2470-449b-9046-76b4c6543cb9-kube-api-access-5q9st\") pod \"swift-ring-rebalance-vwmtj\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:48 crc kubenswrapper[4933]: I1202 16:13:48.392261 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vwmtj"
Dec 02 16:13:49 crc kubenswrapper[4933]: I1202 16:13:49.436116 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-84778dd5c6-qprl6" podUID="1e17b7a2-545e-4757-87d6-4bb0d4f6407c" containerName="console" containerID="cri-o://f092b3ce01fd53d09b740797004140f50a3d7fc565e3d42c7805e63faf6b735f" gracePeriod=15
Dec 02 16:13:49 crc kubenswrapper[4933]: I1202 16:13:49.728893 4933 generic.go:334] "Generic (PLEG): container finished" podID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerID="c0cbe56a7a65f4eb7aa06267f0be741d781f0a45fc009bb3d23073ffd1f9bfcb" exitCode=0
Dec 02 16:13:49 crc kubenswrapper[4933]: I1202 16:13:49.729021 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5df97ea3-3281-4995-bf14-cb09bf09f39d","Type":"ContainerDied","Data":"c0cbe56a7a65f4eb7aa06267f0be741d781f0a45fc009bb3d23073ffd1f9bfcb"}
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.676172 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.676623 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.714156 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vwmtj"]
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.760860 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd426cb0-522e-4c62-874e-a85119e82490","Type":"ContainerStarted","Data":"c5b6326fe316a5337cc644ee945b588573d74d1c1def2bd30ad2e22b5c0e6315"}
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.768090 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5df97ea3-3281-4995-bf14-cb09bf09f39d","Type":"ContainerStarted","Data":"3cc74c4fecf1add4592a00be1e97ffe77de72c7f603cf197d35316e05299c703"}
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.768324 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.771050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vwmtj" event={"ID":"591c4cbc-2470-449b-9046-76b4c6543cb9","Type":"ContainerStarted","Data":"b603ca3198ad37d8505b1c743a0b63cf584393354b1920902633fc1b994c7abf"}
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.772654 4933 generic.go:334] "Generic (PLEG): container finished" podID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerID="7da6f3952a39b50e00a2e909de5ee41531348ab856141537311ba7dea9e99461" exitCode=0
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.772723 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3cdda86-3d0d-486d-ae36-ab6792bff2ab","Type":"ContainerDied","Data":"7da6f3952a39b50e00a2e909de5ee41531348ab856141537311ba7dea9e99461"}
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.779554 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84778dd5c6-qprl6_1e17b7a2-545e-4757-87d6-4bb0d4f6407c/console/0.log"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.779598 4933 generic.go:334] "Generic (PLEG): container finished" podID="1e17b7a2-545e-4757-87d6-4bb0d4f6407c" containerID="f092b3ce01fd53d09b740797004140f50a3d7fc565e3d42c7805e63faf6b735f" exitCode=2
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.779627 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84778dd5c6-qprl6" event={"ID":"1e17b7a2-545e-4757-87d6-4bb0d4f6407c","Type":"ContainerDied","Data":"f092b3ce01fd53d09b740797004140f50a3d7fc565e3d42c7805e63faf6b735f"}
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.785323 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.792506 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371983.062292 podStartE2EDuration="53.792484456s" podCreationTimestamp="2025-12-02 16:12:57 +0000 UTC" firstStartedPulling="2025-12-02 16:13:12.980287124 +0000 UTC m=+1256.231513817" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:13:50.790043451 +0000 UTC m=+1294.041270164" watchObservedRunningTime="2025-12-02 16:13:50.792484456 +0000 UTC m=+1294.043711169"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.799999 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84778dd5c6-qprl6_1e17b7a2-545e-4757-87d6-4bb0d4f6407c/console/0.log"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.800067 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84778dd5c6-qprl6"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.862172 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.457644586 podStartE2EDuration="54.862148777s" podCreationTimestamp="2025-12-02 16:12:56 +0000 UTC" firstStartedPulling="2025-12-02 16:13:12.684393942 +0000 UTC m=+1255.935620635" lastFinishedPulling="2025-12-02 16:13:15.088898123 +0000 UTC m=+1258.340124826" observedRunningTime="2025-12-02 16:13:50.85997377 +0000 UTC m=+1294.111200473" watchObservedRunningTime="2025-12-02 16:13:50.862148777 +0000 UTC m=+1294.113375490"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.917465 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987030 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-service-ca\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987097 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-serving-cert\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987215 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wng4j\" (UniqueName: \"kubernetes.io/projected/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-kube-api-access-wng4j\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987354 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-config\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987414 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-trusted-ca-bundle\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987444 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-oauth-serving-cert\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.987483 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-oauth-config\") pod \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\" (UID: \"1e17b7a2-545e-4757-87d6-4bb0d4f6407c\") "
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.988620 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.988650 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-service-ca" (OuterVolumeSpecName: "service-ca") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.988666 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.989910 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-config" (OuterVolumeSpecName: "console-config") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.995039 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.995205 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-kube-api-access-wng4j" (OuterVolumeSpecName: "kube-api-access-wng4j") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "kube-api-access-wng4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:13:50 crc kubenswrapper[4933]: I1202 16:13:50.995571 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1e17b7a2-545e-4757-87d6-4bb0d4f6407c" (UID: "1e17b7a2-545e-4757-87d6-4bb0d4f6407c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090234 4933 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-config\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090266 4933 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090280 4933 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090289 4933 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090298 4933 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-service-ca\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090306 4933 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.090316 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wng4j\" (UniqueName: \"kubernetes.io/projected/1e17b7a2-545e-4757-87d6-4bb0d4f6407c-kube-api-access-wng4j\") on node \"crc\" DevicePath \"\""
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.793073 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"09f9fb84-64a0-46a8-8551-4fcbf0e46050","Type":"ContainerStarted","Data":"1f19e9713fcdd5bfdc66ce1322b3ccb20258bc3fc52a9850060d466d321dc854"}
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.794969 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.796299 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3cdda86-3d0d-486d-ae36-ab6792bff2ab","Type":"ContainerStarted","Data":"688e25e8a56e20f4ea19c240866f399d248c58f9b1fca7617e58dd509627ef00"}
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.797157 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.799208 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84778dd5c6-qprl6_1e17b7a2-545e-4757-87d6-4bb0d4f6407c/console/0.log"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.799259 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84778dd5c6-qprl6" event={"ID":"1e17b7a2-545e-4757-87d6-4bb0d4f6407c","Type":"ContainerDied","Data":"44475fcd1cc0e6c72863113ea85bfe218ea731f076d0db7d0b8989edbc3c29cf"}
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.799285 4933 scope.go:117] "RemoveContainer" containerID="f092b3ce01fd53d09b740797004140f50a3d7fc565e3d42c7805e63faf6b735f"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.799368 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84778dd5c6-qprl6"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.802850 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"662ba342-72a0-430b-b46d-d6f0f0eafd2b","Type":"ContainerStarted","Data":"13adf61b1e09ecb1db73b00d7d61fcb1a1da90c0be6f04230beb8e7832fb62b6"}
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.823726 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerStarted","Data":"5fd08c6b7b6cb739c7067736e9f8f414c50b71ff937dbe23552505db804fb045"}
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.835111 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.128935138 podStartE2EDuration="49.835083619s" podCreationTimestamp="2025-12-02 16:13:02 +0000 UTC" firstStartedPulling="2025-12-02 16:13:14.988351612 +0000 UTC m=+1258.239578315" lastFinishedPulling="2025-12-02 16:13:50.694500093 +0000 UTC m=+1293.945726796" observedRunningTime="2025-12-02 16:13:51.820647185 +0000 UTC m=+1295.071873918" watchObservedRunningTime="2025-12-02 16:13:51.835083619 +0000 UTC m=+1295.086310322"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.878297 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.19360044 podStartE2EDuration="55.878133903s" podCreationTimestamp="2025-12-02 16:12:56 +0000 UTC" firstStartedPulling="2025-12-02 16:13:11.402717166 +0000 UTC m=+1254.653943869" lastFinishedPulling="2025-12-02 16:13:15.087250629 +0000 UTC m=+1258.338477332" observedRunningTime="2025-12-02 16:13:51.847781776 +0000 UTC m=+1295.099008499" watchObservedRunningTime="2025-12-02 16:13:51.878133903 +0000 UTC m=+1295.129360636"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.900324 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84778dd5c6-qprl6"]
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.909231 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84778dd5c6-qprl6"]
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.911492 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.388404182 podStartE2EDuration="46.911475339s" podCreationTimestamp="2025-12-02 16:13:05 +0000 UTC" firstStartedPulling="2025-12-02 16:13:15.04363008 +0000 UTC m=+1258.294856783" lastFinishedPulling="2025-12-02 16:13:50.566701237 +0000 UTC m=+1293.817927940" observedRunningTime="2025-12-02 16:13:51.891292753 +0000 UTC m=+1295.142519476" watchObservedRunningTime="2025-12-02 16:13:51.911475339 +0000 UTC m=+1295.162702042"
Dec 02 16:13:51 crc kubenswrapper[4933]: I1202 16:13:51.925482 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0"
Dec 02 16:13:51 crc kubenswrapper[4933]: E1202 16:13:51.925627 4933 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 02 16:13:51 crc kubenswrapper[4933]: E1202 16:13:51.925673 4933 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 02 16:13:51 crc kubenswrapper[4933]: E1202 16:13:51.925737 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift podName:cf325ab8-af91-4009-9e8b-a299db2234da nodeName:}" failed. No retries permitted until 2025-12-02 16:13:59.925701197 +0000 UTC m=+1303.176927900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift") pod "swift-storage-0" (UID: "cf325ab8-af91-4009-9e8b-a299db2234da") : configmap "swift-ring-files" not found
Dec 02 16:13:52 crc kubenswrapper[4933]: I1202 16:13:52.117700 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 02 16:13:52 crc kubenswrapper[4933]: I1202 16:13:52.117741 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 02 16:13:53 crc kubenswrapper[4933]: I1202 16:13:53.065638 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e17b7a2-545e-4757-87d6-4bb0d4f6407c" path="/var/lib/kubelet/pods/1e17b7a2-545e-4757-87d6-4bb0d4f6407c/volumes"
Dec 02 16:13:53 crc kubenswrapper[4933]: I1202 16:13:53.384450 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc"
Dec 02 16:13:53 crc kubenswrapper[4933]: I1202 16:13:53.446391 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxkr8"]
Dec 02 16:13:53 crc kubenswrapper[4933]: I1202 16:13:53.446807 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerName="dnsmasq-dns" containerID="cri-o://432b0ec0aa421862c4a079761af395b452a813f5b9ceb3d0c7b111354bdf1e7a" gracePeriod=10
Dec 02 16:13:53 crc kubenswrapper[4933]: I1202 16:13:53.846724 4933 generic.go:334] "Generic (PLEG): container finished" podID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerID="432b0ec0aa421862c4a079761af395b452a813f5b9ceb3d0c7b111354bdf1e7a" exitCode=0
Dec 02 16:13:53 crc kubenswrapper[4933]: I1202 16:13:53.846894 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" event={"ID":"1042bb7b-3049-4492-a177-d1eb37d550a7","Type":"ContainerDied","Data":"432b0ec0aa421862c4a079761af395b452a813f5b9ceb3d0c7b111354bdf1e7a"}
pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" event={"ID":"1042bb7b-3049-4492-a177-d1eb37d550a7","Type":"ContainerDied","Data":"432b0ec0aa421862c4a079761af395b452a813f5b9ceb3d0c7b111354bdf1e7a"} Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.857944 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vwmtj" event={"ID":"591c4cbc-2470-449b-9046-76b4c6543cb9","Type":"ContainerStarted","Data":"29a7af4ac36a03c73f8eef474f00dca005d84e0eaf77cc706a4b09b2dec1fa01"} Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.860315 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerStarted","Data":"96f014b8f282fa1db98bd8bcc75f61999e91a886400a470edc75f46ddd45439a"} Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.862079 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" event={"ID":"1042bb7b-3049-4492-a177-d1eb37d550a7","Type":"ContainerDied","Data":"83b17980631c356d2532612682c8a7947a32d7d962189ece71586c79edb1efcb"} Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.862114 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b17980631c356d2532612682c8a7947a32d7d962189ece71586c79edb1efcb" Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.887430 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vwmtj" podStartSLOduration=3.092991001 podStartE2EDuration="6.887407414s" podCreationTimestamp="2025-12-02 16:13:48 +0000 UTC" firstStartedPulling="2025-12-02 16:13:50.735816661 +0000 UTC m=+1293.987043364" lastFinishedPulling="2025-12-02 16:13:54.530233074 +0000 UTC m=+1297.781459777" observedRunningTime="2025-12-02 16:13:54.877805039 +0000 UTC m=+1298.129031752" watchObservedRunningTime="2025-12-02 16:13:54.887407414 +0000 UTC m=+1298.138634147" Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.899679 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8" Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.989307 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-config\") pod \"1042bb7b-3049-4492-a177-d1eb37d550a7\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.989396 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-dns-svc\") pod \"1042bb7b-3049-4492-a177-d1eb37d550a7\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.989546 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l76z\" (UniqueName: \"kubernetes.io/projected/1042bb7b-3049-4492-a177-d1eb37d550a7-kube-api-access-4l76z\") pod \"1042bb7b-3049-4492-a177-d1eb37d550a7\" (UID: \"1042bb7b-3049-4492-a177-d1eb37d550a7\") " Dec 02 16:13:54 crc kubenswrapper[4933]: I1202 16:13:54.995574 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1042bb7b-3049-4492-a177-d1eb37d550a7-kube-api-access-4l76z" (OuterVolumeSpecName: "kube-api-access-4l76z") pod "1042bb7b-3049-4492-a177-d1eb37d550a7" (UID: "1042bb7b-3049-4492-a177-d1eb37d550a7"). InnerVolumeSpecName "kube-api-access-4l76z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.045742 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-config" (OuterVolumeSpecName: "config") pod "1042bb7b-3049-4492-a177-d1eb37d550a7" (UID: "1042bb7b-3049-4492-a177-d1eb37d550a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.048182 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1042bb7b-3049-4492-a177-d1eb37d550a7" (UID: "1042bb7b-3049-4492-a177-d1eb37d550a7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.096703 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.096739 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l76z\" (UniqueName: \"kubernetes.io/projected/1042bb7b-3049-4492-a177-d1eb37d550a7-kube-api-access-4l76z\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.096754 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1042bb7b-3049-4492-a177-d1eb37d550a7-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.169530 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.214302 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.507014 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 16:13:55 crc kubenswrapper[4933]: E1202 16:13:55.507664 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerName="init" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.507680 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerName="init" Dec 02 16:13:55 crc kubenswrapper[4933]: E1202 16:13:55.507716 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e17b7a2-545e-4757-87d6-4bb0d4f6407c" containerName="console" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.507723 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e17b7a2-545e-4757-87d6-4bb0d4f6407c" containerName="console" Dec 02 16:13:55 crc kubenswrapper[4933]: E1202 16:13:55.507734 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerName="dnsmasq-dns" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.507742 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerName="dnsmasq-dns" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.507947 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e17b7a2-545e-4757-87d6-4bb0d4f6407c" containerName="console" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.507965 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" containerName="dnsmasq-dns" Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.513246 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.515133 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.515424 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xhhjz"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.515606 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.515842 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.525242 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.608625 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-config\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.608673 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.608735 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.608775 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.608885 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqdm\" (UniqueName: \"kubernetes.io/projected/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-kube-api-access-lqqdm\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.608975 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.609011 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-scripts\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710496 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710566 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-scripts\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710645 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-config\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710666 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710698 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710722 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.710785 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqdm\" (UniqueName: \"kubernetes.io/projected/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-kube-api-access-lqqdm\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.711432 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.711708 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-scripts\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.711934 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-config\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.717051 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.719738 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.720397 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.741549 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqdm\" (UniqueName: \"kubernetes.io/projected/f365adeb-4cfd-409d-8bd8-29b4779e1e0f-kube-api-access-lqqdm\") pod \"ovn-northd-0\" (UID: \"f365adeb-4cfd-409d-8bd8-29b4779e1e0f\") " pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.845604 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 02 16:13:55 crc kubenswrapper[4933]: I1202 16:13:55.913977 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxkr8"
Dec 02 16:13:56 crc kubenswrapper[4933]: I1202 16:13:56.070907 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxkr8"]
Dec 02 16:13:56 crc kubenswrapper[4933]: I1202 16:13:56.084500 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxkr8"]
Dec 02 16:13:56 crc kubenswrapper[4933]: I1202 16:13:56.521485 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 02 16:13:56 crc kubenswrapper[4933]: I1202 16:13:56.921324 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f365adeb-4cfd-409d-8bd8-29b4779e1e0f","Type":"ContainerStarted","Data":"b1ce9cabbdd6fdb944ab3b2e0f4bb78ac5fba9170d24bc3af4dc8cba888dec74"}
Dec 02 16:13:57 crc kubenswrapper[4933]: I1202 16:13:57.066213 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1042bb7b-3049-4492-a177-d1eb37d550a7" path="/var/lib/kubelet/pods/1042bb7b-3049-4492-a177-d1eb37d550a7/volumes"
Dec 02 16:13:58 crc kubenswrapper[4933]: I1202 16:13:58.946200 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerStarted","Data":"12047917740de2ce64dc4eaa0e212329828c8c575ed792a68d6f89b9492c7d76"}
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.262925 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.263146 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.347427 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.377623 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.714015964 podStartE2EDuration="56.377604874s" podCreationTimestamp="2025-12-02 16:13:03 +0000 UTC" firstStartedPulling="2025-12-02 16:13:14.97473944 +0000 UTC m=+1258.225966143" lastFinishedPulling="2025-12-02 16:13:58.63832835 +0000 UTC m=+1301.889555053" observedRunningTime="2025-12-02 16:13:58.978967692 +0000 UTC m=+1302.230194415" watchObservedRunningTime="2025-12-02 16:13:59.377604874 +0000 UTC m=+1302.628831567"
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.400072 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.955699 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f365adeb-4cfd-409d-8bd8-29b4779e1e0f","Type":"ContainerStarted","Data":"b855f4d78f4f13021573a7e086a2ab945cb3dc3b2ef8cf5aaa62474692bd6801"}
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.957656 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Dec 02 16:13:59 crc kubenswrapper[4933]: I1202 16:13:59.957702 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f365adeb-4cfd-409d-8bd8-29b4779e1e0f","Type":"ContainerStarted","Data":"6fd2055c2c80594271d3db6ec183623cb7fabab0a6246b36553eda8f9cc51ea2"}
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.002790 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0"
Dec 02 16:14:00 crc kubenswrapper[4933]: E1202 16:14:00.005083 4933 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 02 16:14:00 crc kubenswrapper[4933]: E1202 16:14:00.005110 4933 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 02 16:14:00 crc kubenswrapper[4933]: E1202 16:14:00.005158 4933 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift podName:cf325ab8-af91-4009-9e8b-a299db2234da nodeName:}" failed. No retries permitted until 2025-12-02 16:14:16.005139399 +0000 UTC m=+1319.256366202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift") pod "swift-storage-0" (UID: "cf325ab8-af91-4009-9e8b-a299db2234da") : configmap "swift-ring-files" not found
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.028458 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.046669 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.406765637 podStartE2EDuration="5.046648682s" podCreationTimestamp="2025-12-02 16:13:55 +0000 UTC" firstStartedPulling="2025-12-02 16:13:56.547238328 +0000 UTC m=+1299.798465031" lastFinishedPulling="2025-12-02 16:13:59.187121373 +0000 UTC m=+1302.438348076" observedRunningTime="2025-12-02 16:13:59.983402431 +0000 UTC m=+1303.234629134" watchObservedRunningTime="2025-12-02 16:14:00.046648682 +0000 UTC m=+1303.297875395"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.517025 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-92e3-account-create-update-mql6h"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.518536 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.525318 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.527711 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-tmvnh"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.529230 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tmvnh"
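
Every timestamp in these records carries an m=+... suffix. That is Go's monotonic clock reading: time.Now() captures both a wall-clock and a monotonic reading, and the default String() output appends m=±<seconds> relative to when the process started taking readings, so measured intervals survive wall-clock jumps. A quick demonstration (the printed values are examples, not the log's):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        fmt.Println(time.Now()) // e.g. 2025-12-02 16:14:00.005158 +0000 UTC m=+0.000000042

        time.Sleep(1500 * time.Millisecond)
        fmt.Println(time.Now()) // m=+1.500...: the monotonic delta, not the wall clock
    }

That is also why the retry deadline above appears twice: once as wall time and once as m=+1319.256... on the kubelet process's monotonic scale.
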
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.546506 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-92e3-account-create-update-mql6h"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.563068 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tmvnh"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.617897 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-operator-scripts\") pod \"keystone-92e3-account-create-update-mql6h\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.618273 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2m6b\" (UniqueName: \"kubernetes.io/projected/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-kube-api-access-r2m6b\") pod \"keystone-92e3-account-create-update-mql6h\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.618379 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t89q4\" (UniqueName: \"kubernetes.io/projected/96eee4be-07d4-4e98-a725-94746710698c-kube-api-access-t89q4\") pod \"keystone-db-create-tmvnh\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " pod="openstack/keystone-db-create-tmvnh"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.618425 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96eee4be-07d4-4e98-a725-94746710698c-operator-scripts\") pod \"keystone-db-create-tmvnh\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " pod="openstack/keystone-db-create-tmvnh"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.707676 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kqjfb"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.709113 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kqjfb"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.719316 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kqjfb"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.720665 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4402628f-1d1f-44ad-8b84-e414f7345014-operator-scripts\") pod \"placement-db-create-kqjfb\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " pod="openstack/placement-db-create-kqjfb"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.720729 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-operator-scripts\") pod \"keystone-92e3-account-create-update-mql6h\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.720766 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2m6b\" (UniqueName: \"kubernetes.io/projected/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-kube-api-access-r2m6b\") pod \"keystone-92e3-account-create-update-mql6h\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.720809 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l97lx\" (UniqueName: \"kubernetes.io/projected/4402628f-1d1f-44ad-8b84-e414f7345014-kube-api-access-l97lx\") pod \"placement-db-create-kqjfb\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " pod="openstack/placement-db-create-kqjfb"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.720887 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t89q4\" (UniqueName: \"kubernetes.io/projected/96eee4be-07d4-4e98-a725-94746710698c-kube-api-access-t89q4\") pod \"keystone-db-create-tmvnh\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " pod="openstack/keystone-db-create-tmvnh"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.720938 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96eee4be-07d4-4e98-a725-94746710698c-operator-scripts\") pod \"keystone-db-create-tmvnh\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " pod="openstack/keystone-db-create-tmvnh"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.721781 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96eee4be-07d4-4e98-a725-94746710698c-operator-scripts\") pod \"keystone-db-create-tmvnh\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " pod="openstack/keystone-db-create-tmvnh"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.722789 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-operator-scripts\") pod \"keystone-92e3-account-create-update-mql6h\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.744382 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t89q4\" (UniqueName: \"kubernetes.io/projected/96eee4be-07d4-4e98-a725-94746710698c-kube-api-access-t89q4\") pod \"keystone-db-create-tmvnh\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " pod="openstack/keystone-db-create-tmvnh"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.747293 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2m6b\" (UniqueName: \"kubernetes.io/projected/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-kube-api-access-r2m6b\") pod \"keystone-92e3-account-create-update-mql6h\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " pod="openstack/keystone-92e3-account-create-update-mql6h"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.816583 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2d40-account-create-update-cb4br"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.821475 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2d40-account-create-update-cb4br"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.822425 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbhjk\" (UniqueName: \"kubernetes.io/projected/33d64345-2d6d-47a6-90cf-2a7430c9749d-kube-api-access-bbhjk\") pod \"placement-2d40-account-create-update-cb4br\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " pod="openstack/placement-2d40-account-create-update-cb4br"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.822675 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4402628f-1d1f-44ad-8b84-e414f7345014-operator-scripts\") pod \"placement-db-create-kqjfb\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " pod="openstack/placement-db-create-kqjfb"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.822715 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d64345-2d6d-47a6-90cf-2a7430c9749d-operator-scripts\") pod \"placement-2d40-account-create-update-cb4br\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " pod="openstack/placement-2d40-account-create-update-cb4br"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.822791 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l97lx\" (UniqueName: \"kubernetes.io/projected/4402628f-1d1f-44ad-8b84-e414f7345014-kube-api-access-l97lx\") pod \"placement-db-create-kqjfb\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " pod="openstack/placement-db-create-kqjfb"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.823525 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4402628f-1d1f-44ad-8b84-e414f7345014-operator-scripts\") pod \"placement-db-create-kqjfb\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " pod="openstack/placement-db-create-kqjfb"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.824506 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.834436 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2d40-account-create-update-cb4br"]
Dec 02 16:14:00 crc kubenswrapper[4933]: I1202
16:14:00.850788 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l97lx\" (UniqueName: \"kubernetes.io/projected/4402628f-1d1f-44ad-8b84-e414f7345014-kube-api-access-l97lx\") pod \"placement-db-create-kqjfb\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " pod="openstack/placement-db-create-kqjfb" Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.881386 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tmvnh" Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.881573 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-92e3-account-create-update-mql6h" Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.926514 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d64345-2d6d-47a6-90cf-2a7430c9749d-operator-scripts\") pod \"placement-2d40-account-create-update-cb4br\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.926670 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbhjk\" (UniqueName: \"kubernetes.io/projected/33d64345-2d6d-47a6-90cf-2a7430c9749d-kube-api-access-bbhjk\") pod \"placement-2d40-account-create-update-cb4br\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.927623 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d64345-2d6d-47a6-90cf-2a7430c9749d-operator-scripts\") pod \"placement-2d40-account-create-update-cb4br\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:00 crc kubenswrapper[4933]: I1202 16:14:00.943930 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbhjk\" (UniqueName: \"kubernetes.io/projected/33d64345-2d6d-47a6-90cf-2a7430c9749d-kube-api-access-bbhjk\") pod \"placement-2d40-account-create-update-cb4br\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.035449 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kqjfb" Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.040847 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.395253 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tmvnh"] Dec 02 16:14:01 crc kubenswrapper[4933]: W1202 16:14:01.404209 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96eee4be_07d4_4e98_a725_94746710698c.slice/crio-059b8e79d8760f1eb2b93cc013f4c3444b5b01aa46bf13d7773641b9ff563fff WatchSource:0}: Error finding container 059b8e79d8760f1eb2b93cc013f4c3444b5b01aa46bf13d7773641b9ff563fff: Status 404 returned error can't find the container with id 059b8e79d8760f1eb2b93cc013f4c3444b5b01aa46bf13d7773641b9ff563fff Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.510957 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-92e3-account-create-update-mql6h"] Dec 02 16:14:01 crc kubenswrapper[4933]: W1202 16:14:01.630205 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4402628f_1d1f_44ad_8b84_e414f7345014.slice/crio-556993931c61094ea06bceb1f30641cf9adc3297a0d3ad1f9827937e7acbfb07 WatchSource:0}: Error finding container 556993931c61094ea06bceb1f30641cf9adc3297a0d3ad1f9827937e7acbfb07: Status 404 returned error can't find the container with id 556993931c61094ea06bceb1f30641cf9adc3297a0d3ad1f9827937e7acbfb07 Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.630646 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kqjfb"] Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.749163 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2d40-account-create-update-cb4br"] Dec 02 16:14:01 crc kubenswrapper[4933]: W1202 16:14:01.754097 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33d64345_2d6d_47a6_90cf_2a7430c9749d.slice/crio-0dccb983f58cbd7a433bc7787c475b8e83186bd5ec693c7aa6d01bba1145c3a3 WatchSource:0}: Error finding container 0dccb983f58cbd7a433bc7787c475b8e83186bd5ec693c7aa6d01bba1145c3a3: Status 404 returned error can't find the container with id 0dccb983f58cbd7a433bc7787c475b8e83186bd5ec693c7aa6d01bba1145c3a3 Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.980098 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d40-account-create-update-cb4br" event={"ID":"33d64345-2d6d-47a6-90cf-2a7430c9749d","Type":"ContainerStarted","Data":"0dccb983f58cbd7a433bc7787c475b8e83186bd5ec693c7aa6d01bba1145c3a3"} Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.981744 4933 generic.go:334] "Generic (PLEG): container finished" podID="4402628f-1d1f-44ad-8b84-e414f7345014" containerID="a96f0cd0e5a6ddd987b6b04080a43336cc8ffe9e08346f77b8de85b4e09b413c" exitCode=0 Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.981811 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kqjfb" event={"ID":"4402628f-1d1f-44ad-8b84-e414f7345014","Type":"ContainerDied","Data":"a96f0cd0e5a6ddd987b6b04080a43336cc8ffe9e08346f77b8de85b4e09b413c"} Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.981873 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kqjfb" 
event={"ID":"4402628f-1d1f-44ad-8b84-e414f7345014","Type":"ContainerStarted","Data":"556993931c61094ea06bceb1f30641cf9adc3297a0d3ad1f9827937e7acbfb07"} Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.983345 4933 generic.go:334] "Generic (PLEG): container finished" podID="0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" containerID="2dcb116ae0821b2675b43f05b8244b124bb63ad3795def616da768a32fbfefd4" exitCode=0 Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.983388 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-92e3-account-create-update-mql6h" event={"ID":"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b","Type":"ContainerDied","Data":"2dcb116ae0821b2675b43f05b8244b124bb63ad3795def616da768a32fbfefd4"} Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.983403 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-92e3-account-create-update-mql6h" event={"ID":"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b","Type":"ContainerStarted","Data":"ab73f17473241fdb9f82caeb5f471d8c5016bf4193a8b6f519a5862eea972a40"} Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.985183 4933 generic.go:334] "Generic (PLEG): container finished" podID="96eee4be-07d4-4e98-a725-94746710698c" containerID="1d9bd01203c548cba964d12bb32bb71fdb8053e98deda054288b5b0db059913c" exitCode=0 Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.985317 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tmvnh" event={"ID":"96eee4be-07d4-4e98-a725-94746710698c","Type":"ContainerDied","Data":"1d9bd01203c548cba964d12bb32bb71fdb8053e98deda054288b5b0db059913c"} Dec 02 16:14:01 crc kubenswrapper[4933]: I1202 16:14:01.985384 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tmvnh" event={"ID":"96eee4be-07d4-4e98-a725-94746710698c","Type":"ContainerStarted","Data":"059b8e79d8760f1eb2b93cc013f4c3444b5b01aa46bf13d7773641b9ff563fff"} Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.772767 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-l2rkw"] Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.775469 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.786542 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-l2rkw"] Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.872497 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf17e438-7511-4c81-ad30-864da63a2965-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-l2rkw\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.872697 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5xn\" (UniqueName: \"kubernetes.io/projected/cf17e438-7511-4c81-ad30-864da63a2965-kube-api-access-7x5xn\") pod \"mysqld-exporter-openstack-db-create-l2rkw\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.968704 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-f87c-account-create-update-zndp8"] Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.970437 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.973833 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.975402 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5xn\" (UniqueName: \"kubernetes.io/projected/cf17e438-7511-4c81-ad30-864da63a2965-kube-api-access-7x5xn\") pod \"mysqld-exporter-openstack-db-create-l2rkw\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.975529 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf17e438-7511-4c81-ad30-864da63a2965-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-l2rkw\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.976739 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf17e438-7511-4c81-ad30-864da63a2965-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-l2rkw\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:02 crc kubenswrapper[4933]: I1202 16:14:02.984903 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-f87c-account-create-update-zndp8"] Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.005439 4933 generic.go:334] "Generic (PLEG): container finished" podID="591c4cbc-2470-449b-9046-76b4c6543cb9" containerID="29a7af4ac36a03c73f8eef474f00dca005d84e0eaf77cc706a4b09b2dec1fa01" exitCode=0 Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.005551 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-vwmtj" event={"ID":"591c4cbc-2470-449b-9046-76b4c6543cb9","Type":"ContainerDied","Data":"29a7af4ac36a03c73f8eef474f00dca005d84e0eaf77cc706a4b09b2dec1fa01"} Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.008461 4933 generic.go:334] "Generic (PLEG): container finished" podID="33d64345-2d6d-47a6-90cf-2a7430c9749d" containerID="7b1087e21782d7d67edc0d348786fee50b0980ada5c687456d587679e77d17fe" exitCode=0 Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.009659 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d40-account-create-update-cb4br" event={"ID":"33d64345-2d6d-47a6-90cf-2a7430c9749d","Type":"ContainerDied","Data":"7b1087e21782d7d67edc0d348786fee50b0980ada5c687456d587679e77d17fe"} Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.032135 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5xn\" (UniqueName: \"kubernetes.io/projected/cf17e438-7511-4c81-ad30-864da63a2965-kube-api-access-7x5xn\") pod \"mysqld-exporter-openstack-db-create-l2rkw\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.070464 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.081220 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvg5\" (UniqueName: \"kubernetes.io/projected/63164577-9438-40d2-96b6-ba81f3d960d4-kube-api-access-kjvg5\") pod \"mysqld-exporter-f87c-account-create-update-zndp8\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.081471 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63164577-9438-40d2-96b6-ba81f3d960d4-operator-scripts\") pod \"mysqld-exporter-f87c-account-create-update-zndp8\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.152425 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.183428 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvg5\" (UniqueName: \"kubernetes.io/projected/63164577-9438-40d2-96b6-ba81f3d960d4-kube-api-access-kjvg5\") pod \"mysqld-exporter-f87c-account-create-update-zndp8\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.183487 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63164577-9438-40d2-96b6-ba81f3d960d4-operator-scripts\") pod \"mysqld-exporter-f87c-account-create-update-zndp8\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.185312 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63164577-9438-40d2-96b6-ba81f3d960d4-operator-scripts\") pod \"mysqld-exporter-f87c-account-create-update-zndp8\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.212447 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvg5\" (UniqueName: \"kubernetes.io/projected/63164577-9438-40d2-96b6-ba81f3d960d4-kube-api-access-kjvg5\") pod \"mysqld-exporter-f87c-account-create-update-zndp8\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.304579 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.580989 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kqjfb" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.698481 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4402628f-1d1f-44ad-8b84-e414f7345014-operator-scripts\") pod \"4402628f-1d1f-44ad-8b84-e414f7345014\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.698626 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l97lx\" (UniqueName: \"kubernetes.io/projected/4402628f-1d1f-44ad-8b84-e414f7345014-kube-api-access-l97lx\") pod \"4402628f-1d1f-44ad-8b84-e414f7345014\" (UID: \"4402628f-1d1f-44ad-8b84-e414f7345014\") " Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.699661 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4402628f-1d1f-44ad-8b84-e414f7345014-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4402628f-1d1f-44ad-8b84-e414f7345014" (UID: "4402628f-1d1f-44ad-8b84-e414f7345014"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.704873 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4402628f-1d1f-44ad-8b84-e414f7345014-kube-api-access-l97lx" (OuterVolumeSpecName: "kube-api-access-l97lx") pod "4402628f-1d1f-44ad-8b84-e414f7345014" (UID: "4402628f-1d1f-44ad-8b84-e414f7345014"). InnerVolumeSpecName "kube-api-access-l97lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.732183 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tmvnh" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.754749 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-92e3-account-create-update-mql6h" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.801767 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4402628f-1d1f-44ad-8b84-e414f7345014-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.801800 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l97lx\" (UniqueName: \"kubernetes.io/projected/4402628f-1d1f-44ad-8b84-e414f7345014-kube-api-access-l97lx\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:03 crc kubenswrapper[4933]: W1202 16:14:03.836789 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf17e438_7511_4c81_ad30_864da63a2965.slice/crio-e3e53f140fd0fd82ccde05811dd11c9975bd6d501f773d5642d7bb896522b2e3 WatchSource:0}: Error finding container e3e53f140fd0fd82ccde05811dd11c9975bd6d501f773d5642d7bb896522b2e3: Status 404 returned error can't find the container with id e3e53f140fd0fd82ccde05811dd11c9975bd6d501f773d5642d7bb896522b2e3 Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.840895 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-l2rkw"] Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.903418 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t89q4\" (UniqueName: \"kubernetes.io/projected/96eee4be-07d4-4e98-a725-94746710698c-kube-api-access-t89q4\") pod \"96eee4be-07d4-4e98-a725-94746710698c\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.903926 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2m6b\" (UniqueName: \"kubernetes.io/projected/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-kube-api-access-r2m6b\") pod \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.903983 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-operator-scripts\") pod \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\" (UID: \"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b\") " Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.904056 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96eee4be-07d4-4e98-a725-94746710698c-operator-scripts\") pod 
\"96eee4be-07d4-4e98-a725-94746710698c\" (UID: \"96eee4be-07d4-4e98-a725-94746710698c\") " Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.904598 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" (UID: "0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.904956 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.904998 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96eee4be-07d4-4e98-a725-94746710698c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96eee4be-07d4-4e98-a725-94746710698c" (UID: "96eee4be-07d4-4e98-a725-94746710698c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.908247 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96eee4be-07d4-4e98-a725-94746710698c-kube-api-access-t89q4" (OuterVolumeSpecName: "kube-api-access-t89q4") pod "96eee4be-07d4-4e98-a725-94746710698c" (UID: "96eee4be-07d4-4e98-a725-94746710698c"). InnerVolumeSpecName "kube-api-access-t89q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:03 crc kubenswrapper[4933]: I1202 16:14:03.908557 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-kube-api-access-r2m6b" (OuterVolumeSpecName: "kube-api-access-r2m6b") pod "0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" (UID: "0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b"). InnerVolumeSpecName "kube-api-access-r2m6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.007237 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t89q4\" (UniqueName: \"kubernetes.io/projected/96eee4be-07d4-4e98-a725-94746710698c-kube-api-access-t89q4\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.007262 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2m6b\" (UniqueName: \"kubernetes.io/projected/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b-kube-api-access-r2m6b\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.007271 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96eee4be-07d4-4e98-a725-94746710698c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.040366 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-92e3-account-create-update-mql6h" event={"ID":"0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b","Type":"ContainerDied","Data":"ab73f17473241fdb9f82caeb5f471d8c5016bf4193a8b6f519a5862eea972a40"} Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.040409 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab73f17473241fdb9f82caeb5f471d8c5016bf4193a8b6f519a5862eea972a40" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.040414 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-92e3-account-create-update-mql6h" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.042529 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tmvnh" event={"ID":"96eee4be-07d4-4e98-a725-94746710698c","Type":"ContainerDied","Data":"059b8e79d8760f1eb2b93cc013f4c3444b5b01aa46bf13d7773641b9ff563fff"} Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.042571 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="059b8e79d8760f1eb2b93cc013f4c3444b5b01aa46bf13d7773641b9ff563fff" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.042546 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tmvnh" Dec 02 16:14:04 crc kubenswrapper[4933]: W1202 16:14:04.042743 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63164577_9438_40d2_96b6_ba81f3d960d4.slice/crio-8f3e6e4ad5abbaee3b39f67f360d28386aff5e8496c864a81e5e13664696814e WatchSource:0}: Error finding container 8f3e6e4ad5abbaee3b39f67f360d28386aff5e8496c864a81e5e13664696814e: Status 404 returned error can't find the container with id 8f3e6e4ad5abbaee3b39f67f360d28386aff5e8496c864a81e5e13664696814e Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.045277 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" event={"ID":"cf17e438-7511-4c81-ad30-864da63a2965","Type":"ContainerStarted","Data":"a2333808787add1f966b4b40008b6f37bd16437f2a0180b7d8d03e1e9b4eb72d"} Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.045323 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" event={"ID":"cf17e438-7511-4c81-ad30-864da63a2965","Type":"ContainerStarted","Data":"e3e53f140fd0fd82ccde05811dd11c9975bd6d501f773d5642d7bb896522b2e3"} Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.051687 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kqjfb" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.052642 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kqjfb" event={"ID":"4402628f-1d1f-44ad-8b84-e414f7345014","Type":"ContainerDied","Data":"556993931c61094ea06bceb1f30641cf9adc3297a0d3ad1f9827937e7acbfb07"} Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.052671 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556993931c61094ea06bceb1f30641cf9adc3297a0d3ad1f9827937e7acbfb07" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.064901 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-f87c-account-create-update-zndp8"] Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.080246 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" podStartSLOduration=2.08021833 podStartE2EDuration="2.08021833s" podCreationTimestamp="2025-12-02 16:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:04.070859701 +0000 UTC m=+1307.322086424" watchObservedRunningTime="2025-12-02 16:14:04.08021833 +0000 UTC m=+1307.331445033" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.400385 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.406058 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.561120 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vwmtj" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.575275 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619047 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-swiftconf\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619102 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d64345-2d6d-47a6-90cf-2a7430c9749d-operator-scripts\") pod \"33d64345-2d6d-47a6-90cf-2a7430c9749d\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619148 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9st\" (UniqueName: \"kubernetes.io/projected/591c4cbc-2470-449b-9046-76b4c6543cb9-kube-api-access-5q9st\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619294 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/591c4cbc-2470-449b-9046-76b4c6543cb9-etc-swift\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619407 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-ring-data-devices\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619479 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-combined-ca-bundle\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619604 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbhjk\" (UniqueName: \"kubernetes.io/projected/33d64345-2d6d-47a6-90cf-2a7430c9749d-kube-api-access-bbhjk\") pod \"33d64345-2d6d-47a6-90cf-2a7430c9749d\" (UID: \"33d64345-2d6d-47a6-90cf-2a7430c9749d\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619764 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-dispersionconf\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.619813 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-scripts\") pod \"591c4cbc-2470-449b-9046-76b4c6543cb9\" (UID: \"591c4cbc-2470-449b-9046-76b4c6543cb9\") " Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.621536 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d64345-2d6d-47a6-90cf-2a7430c9749d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"33d64345-2d6d-47a6-90cf-2a7430c9749d" (UID: "33d64345-2d6d-47a6-90cf-2a7430c9749d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.622307 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.622650 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591c4cbc-2470-449b-9046-76b4c6543cb9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.631745 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d64345-2d6d-47a6-90cf-2a7430c9749d-kube-api-access-bbhjk" (OuterVolumeSpecName: "kube-api-access-bbhjk") pod "33d64345-2d6d-47a6-90cf-2a7430c9749d" (UID: "33d64345-2d6d-47a6-90cf-2a7430c9749d"). InnerVolumeSpecName "kube-api-access-bbhjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.631882 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591c4cbc-2470-449b-9046-76b4c6543cb9-kube-api-access-5q9st" (OuterVolumeSpecName: "kube-api-access-5q9st") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "kube-api-access-5q9st". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.656607 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.665648 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-scripts" (OuterVolumeSpecName: "scripts") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.677420 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.682178 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "591c4cbc-2470-449b-9046-76b4c6543cb9" (UID: "591c4cbc-2470-449b-9046-76b4c6543cb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722830 4933 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/591c4cbc-2470-449b-9046-76b4c6543cb9-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722886 4933 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722902 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722917 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbhjk\" (UniqueName: \"kubernetes.io/projected/33d64345-2d6d-47a6-90cf-2a7430c9749d-kube-api-access-bbhjk\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722932 4933 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722969 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591c4cbc-2470-449b-9046-76b4c6543cb9-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722980 4933 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/591c4cbc-2470-449b-9046-76b4c6543cb9-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.722993 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d64345-2d6d-47a6-90cf-2a7430c9749d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:04 crc kubenswrapper[4933]: I1202 16:14:04.723008 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9st\" (UniqueName: \"kubernetes.io/projected/591c4cbc-2470-449b-9046-76b4c6543cb9-kube-api-access-5q9st\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.069243 4933 generic.go:334] "Generic (PLEG): container finished" podID="63164577-9438-40d2-96b6-ba81f3d960d4" containerID="c67940f616718367b793f26e7ff40e948807364b58ce4eab90d8cb3819d1b627" exitCode=0 Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.071072 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vwmtj" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.075940 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2d40-account-create-update-cb4br" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.077487 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" event={"ID":"63164577-9438-40d2-96b6-ba81f3d960d4","Type":"ContainerDied","Data":"c67940f616718367b793f26e7ff40e948807364b58ce4eab90d8cb3819d1b627"} Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.077522 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" event={"ID":"63164577-9438-40d2-96b6-ba81f3d960d4","Type":"ContainerStarted","Data":"8f3e6e4ad5abbaee3b39f67f360d28386aff5e8496c864a81e5e13664696814e"} Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.077537 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vwmtj" event={"ID":"591c4cbc-2470-449b-9046-76b4c6543cb9","Type":"ContainerDied","Data":"b603ca3198ad37d8505b1c743a0b63cf584393354b1920902633fc1b994c7abf"} Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.077555 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b603ca3198ad37d8505b1c743a0b63cf584393354b1920902633fc1b994c7abf" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.077567 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2d40-account-create-update-cb4br" event={"ID":"33d64345-2d6d-47a6-90cf-2a7430c9749d","Type":"ContainerDied","Data":"0dccb983f58cbd7a433bc7787c475b8e83186bd5ec693c7aa6d01bba1145c3a3"} Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.077578 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dccb983f58cbd7a433bc7787c475b8e83186bd5ec693c7aa6d01bba1145c3a3" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.079707 4933 generic.go:334] "Generic (PLEG): container finished" podID="cf17e438-7511-4c81-ad30-864da63a2965" containerID="a2333808787add1f966b4b40008b6f37bd16437f2a0180b7d8d03e1e9b4eb72d" exitCode=0 Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.081090 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" event={"ID":"cf17e438-7511-4c81-ad30-864da63a2965","Type":"ContainerDied","Data":"a2333808787add1f966b4b40008b6f37bd16437f2a0180b7d8d03e1e9b4eb72d"} Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.082702 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.997788 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5pwnn"] Dec 02 16:14:05 crc kubenswrapper[4933]: E1202 16:14:05.998293 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4402628f-1d1f-44ad-8b84-e414f7345014" containerName="mariadb-database-create" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998320 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4402628f-1d1f-44ad-8b84-e414f7345014" containerName="mariadb-database-create" Dec 02 16:14:05 crc kubenswrapper[4933]: E1202 16:14:05.998338 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" containerName="mariadb-account-create-update" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998347 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" 
containerName="mariadb-account-create-update" Dec 02 16:14:05 crc kubenswrapper[4933]: E1202 16:14:05.998364 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d64345-2d6d-47a6-90cf-2a7430c9749d" containerName="mariadb-account-create-update" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998373 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d64345-2d6d-47a6-90cf-2a7430c9749d" containerName="mariadb-account-create-update" Dec 02 16:14:05 crc kubenswrapper[4933]: E1202 16:14:05.998388 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96eee4be-07d4-4e98-a725-94746710698c" containerName="mariadb-database-create" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998396 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="96eee4be-07d4-4e98-a725-94746710698c" containerName="mariadb-database-create" Dec 02 16:14:05 crc kubenswrapper[4933]: E1202 16:14:05.998416 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591c4cbc-2470-449b-9046-76b4c6543cb9" containerName="swift-ring-rebalance" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998424 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="591c4cbc-2470-449b-9046-76b4c6543cb9" containerName="swift-ring-rebalance" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998708 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="96eee4be-07d4-4e98-a725-94746710698c" containerName="mariadb-database-create" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998731 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" containerName="mariadb-account-create-update" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998750 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4402628f-1d1f-44ad-8b84-e414f7345014" containerName="mariadb-database-create" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998763 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="591c4cbc-2470-449b-9046-76b4c6543cb9" containerName="swift-ring-rebalance" Dec 02 16:14:05 crc kubenswrapper[4933]: I1202 16:14:05.998784 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d64345-2d6d-47a6-90cf-2a7430c9749d" containerName="mariadb-account-create-update" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.000040 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.019009 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5pwnn"] Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.049656 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/360b6f71-eb21-4616-87ef-d142ca4a2e1a-operator-scripts\") pod \"glance-db-create-5pwnn\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.049850 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zspth\" (UniqueName: \"kubernetes.io/projected/360b6f71-eb21-4616-87ef-d142ca4a2e1a-kube-api-access-zspth\") pod \"glance-db-create-5pwnn\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.090175 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-139c-account-create-update-rws48"] Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.091510 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.093734 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.107734 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-139c-account-create-update-rws48"] Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.151987 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/360b6f71-eb21-4616-87ef-d142ca4a2e1a-operator-scripts\") pod \"glance-db-create-5pwnn\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.152080 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zspth\" (UniqueName: \"kubernetes.io/projected/360b6f71-eb21-4616-87ef-d142ca4a2e1a-kube-api-access-zspth\") pod \"glance-db-create-5pwnn\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.153473 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/360b6f71-eb21-4616-87ef-d142ca4a2e1a-operator-scripts\") pod \"glance-db-create-5pwnn\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.172558 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zspth\" (UniqueName: \"kubernetes.io/projected/360b6f71-eb21-4616-87ef-d142ca4a2e1a-kube-api-access-zspth\") pod \"glance-db-create-5pwnn\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.254345 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs99r\" (UniqueName: \"kubernetes.io/projected/ada1850f-4671-48a5-97f3-7b82aab8ab97-kube-api-access-rs99r\") 
pod \"glance-139c-account-create-update-rws48\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.254438 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada1850f-4671-48a5-97f3-7b82aab8ab97-operator-scripts\") pod \"glance-139c-account-create-update-rws48\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.323229 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.357203 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada1850f-4671-48a5-97f3-7b82aab8ab97-operator-scripts\") pod \"glance-139c-account-create-update-rws48\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.357594 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs99r\" (UniqueName: \"kubernetes.io/projected/ada1850f-4671-48a5-97f3-7b82aab8ab97-kube-api-access-rs99r\") pod \"glance-139c-account-create-update-rws48\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.358817 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada1850f-4671-48a5-97f3-7b82aab8ab97-operator-scripts\") pod \"glance-139c-account-create-update-rws48\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.383100 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs99r\" (UniqueName: \"kubernetes.io/projected/ada1850f-4671-48a5-97f3-7b82aab8ab97-kube-api-access-rs99r\") pod \"glance-139c-account-create-update-rws48\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.410683 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.416638 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.417300 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2z7kb" Dec 02 16:14:06 crc kubenswrapper[4933]: I1202 16:14:06.568627 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.730474 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.764375 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-g8qb5-config-zgdq8"] Dec 02 16:14:07 crc kubenswrapper[4933]: E1202 16:14:06.765289 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf17e438-7511-4c81-ad30-864da63a2965" containerName="mariadb-database-create" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.765302 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf17e438-7511-4c81-ad30-864da63a2965" containerName="mariadb-database-create" Dec 02 16:14:07 crc kubenswrapper[4933]: E1202 16:14:06.766454 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63164577-9438-40d2-96b6-ba81f3d960d4" containerName="mariadb-account-create-update" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.766466 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="63164577-9438-40d2-96b6-ba81f3d960d4" containerName="mariadb-account-create-update" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.766687 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="63164577-9438-40d2-96b6-ba81f3d960d4" containerName="mariadb-account-create-update" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.766714 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf17e438-7511-4c81-ad30-864da63a2965" containerName="mariadb-database-create" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.768582 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.770533 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63164577-9438-40d2-96b6-ba81f3d960d4-operator-scripts\") pod \"63164577-9438-40d2-96b6-ba81f3d960d4\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.770741 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjvg5\" (UniqueName: \"kubernetes.io/projected/63164577-9438-40d2-96b6-ba81f3d960d4-kube-api-access-kjvg5\") pod \"63164577-9438-40d2-96b6-ba81f3d960d4\" (UID: \"63164577-9438-40d2-96b6-ba81f3d960d4\") " Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.773227 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.774175 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63164577-9438-40d2-96b6-ba81f3d960d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63164577-9438-40d2-96b6-ba81f3d960d4" (UID: "63164577-9438-40d2-96b6-ba81f3d960d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.787195 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63164577-9438-40d2-96b6-ba81f3d960d4-kube-api-access-kjvg5" (OuterVolumeSpecName: "kube-api-access-kjvg5") pod "63164577-9438-40d2-96b6-ba81f3d960d4" (UID: "63164577-9438-40d2-96b6-ba81f3d960d4"). InnerVolumeSpecName "kube-api-access-kjvg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.794656 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8qb5-config-zgdq8"] Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.875088 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x5xn\" (UniqueName: \"kubernetes.io/projected/cf17e438-7511-4c81-ad30-864da63a2965-kube-api-access-7x5xn\") pod \"cf17e438-7511-4c81-ad30-864da63a2965\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.875396 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf17e438-7511-4c81-ad30-864da63a2965-operator-scripts\") pod \"cf17e438-7511-4c81-ad30-864da63a2965\" (UID: \"cf17e438-7511-4c81-ad30-864da63a2965\") " Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.876026 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmw99\" (UniqueName: \"kubernetes.io/projected/854d6a3f-90b4-4606-8432-9888fced9115-kube-api-access-fmw99\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.876133 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-scripts\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.876385 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run-ovn\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.876464 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-additional-scripts\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.879927 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-log-ovn\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.880050 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.880337 4933 reconciler_common.go:293] "Volume detached 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63164577-9438-40d2-96b6-ba81f3d960d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.880358 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjvg5\" (UniqueName: \"kubernetes.io/projected/63164577-9438-40d2-96b6-ba81f3d960d4-kube-api-access-kjvg5\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.880416 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf17e438-7511-4c81-ad30-864da63a2965-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf17e438-7511-4c81-ad30-864da63a2965" (UID: "cf17e438-7511-4c81-ad30-864da63a2965"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.885709 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf17e438-7511-4c81-ad30-864da63a2965-kube-api-access-7x5xn" (OuterVolumeSpecName: "kube-api-access-7x5xn") pod "cf17e438-7511-4c81-ad30-864da63a2965" (UID: "cf17e438-7511-4c81-ad30-864da63a2965"). InnerVolumeSpecName "kube-api-access-7x5xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.982535 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmw99\" (UniqueName: \"kubernetes.io/projected/854d6a3f-90b4-4606-8432-9888fced9115-kube-api-access-fmw99\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.982612 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-scripts\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985037 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-scripts\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985184 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run-ovn\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985254 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-additional-scripts\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985277 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-log-ovn\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985349 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985461 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf17e438-7511-4c81-ad30-864da63a2965-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985477 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x5xn\" (UniqueName: \"kubernetes.io/projected/cf17e438-7511-4c81-ad30-864da63a2965-kube-api-access-7x5xn\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985815 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.985903 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run-ovn\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.986408 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-additional-scripts\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:06.986465 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-log-ovn\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.055568 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmw99\" (UniqueName: \"kubernetes.io/projected/854d6a3f-90b4-4606-8432-9888fced9115-kube-api-access-fmw99\") pod \"ovn-controller-g8qb5-config-zgdq8\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.105441 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.128513 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" event={"ID":"cf17e438-7511-4c81-ad30-864da63a2965","Type":"ContainerDied","Data":"e3e53f140fd0fd82ccde05811dd11c9975bd6d501f773d5642d7bb896522b2e3"} Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.128562 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e53f140fd0fd82ccde05811dd11c9975bd6d501f773d5642d7bb896522b2e3" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.128528 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-l2rkw" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.130554 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.131112 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-f87c-account-create-update-zndp8" event={"ID":"63164577-9438-40d2-96b6-ba81f3d960d4","Type":"ContainerDied","Data":"8f3e6e4ad5abbaee3b39f67f360d28386aff5e8496c864a81e5e13664696814e"} Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.131139 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3e6e4ad5abbaee3b39f67f360d28386aff5e8496c864a81e5e13664696814e" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.536105 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.844763 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5pwnn"] Dec 02 16:14:07 crc kubenswrapper[4933]: I1202 16:14:07.891026 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.103291 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-9g2jj"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.105029 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.126894 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-59cd-account-create-update-fqk6n"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.128337 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.131490 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.147716 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9g2jj"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.160322 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5pwnn" event={"ID":"360b6f71-eb21-4616-87ef-d142ca4a2e1a","Type":"ContainerStarted","Data":"f389c81d1304fc08b5c71de6d57df6441f36092cc3f38637b57496111803c656"} Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.160362 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5pwnn" event={"ID":"360b6f71-eb21-4616-87ef-d142ca4a2e1a","Type":"ContainerStarted","Data":"b1bf1cb6ce1af60ef261c21923ca050b067814b7559257a04c1691f463e18a58"} Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.161762 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-59cd-account-create-update-fqk6n"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.223341 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft59b\" (UniqueName: \"kubernetes.io/projected/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-kube-api-access-ft59b\") pod \"heat-db-create-9g2jj\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.223662 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-operator-scripts\") pod \"heat-db-create-9g2jj\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.241703 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-139c-account-create-update-rws48"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.256709 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8qb5-config-zgdq8"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.265173 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-5pwnn" podStartSLOduration=3.265153169 podStartE2EDuration="3.265153169s" podCreationTimestamp="2025-12-02 16:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:08.198516429 +0000 UTC m=+1311.449743142" watchObservedRunningTime="2025-12-02 16:14:08.265153169 +0000 UTC m=+1311.516379872" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.327044 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vh4vf"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.333997 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.346662 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntjs" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.346876 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.349250 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.349594 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.352180 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft59b\" (UniqueName: \"kubernetes.io/projected/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-kube-api-access-ft59b\") pod \"heat-db-create-9g2jj\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.352502 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-operator-scripts\") pod \"heat-db-create-9g2jj\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.352688 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82ac1fbd-600e-4795-855d-3dae65b36263-operator-scripts\") pod \"heat-59cd-account-create-update-fqk6n\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.352955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwj6d\" (UniqueName: \"kubernetes.io/projected/82ac1fbd-600e-4795-855d-3dae65b36263-kube-api-access-bwj6d\") pod \"heat-59cd-account-create-update-fqk6n\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.354081 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-operator-scripts\") pod \"heat-db-create-9g2jj\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.354068 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dvgn2"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.369075 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.429186 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dvgn2"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.424914 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft59b\" (UniqueName: \"kubernetes.io/projected/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-kube-api-access-ft59b\") pod \"heat-db-create-9g2jj\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.443860 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vh4vf"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.450323 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454453 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-config-data\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454497 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be505e35-b5c6-434e-9f60-621f6adb19aa-operator-scripts\") pod \"barbican-db-create-dvgn2\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454570 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mvx\" (UniqueName: \"kubernetes.io/projected/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-kube-api-access-w7mvx\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454633 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82ac1fbd-600e-4795-855d-3dae65b36263-operator-scripts\") pod \"heat-59cd-account-create-update-fqk6n\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454728 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275rp\" (UniqueName: \"kubernetes.io/projected/be505e35-b5c6-434e-9f60-621f6adb19aa-kube-api-access-275rp\") pod \"barbican-db-create-dvgn2\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454798 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwj6d\" (UniqueName: \"kubernetes.io/projected/82ac1fbd-600e-4795-855d-3dae65b36263-kube-api-access-bwj6d\") pod \"heat-59cd-account-create-update-fqk6n\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.454857 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-combined-ca-bundle\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.455243 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82ac1fbd-600e-4795-855d-3dae65b36263-operator-scripts\") pod \"heat-59cd-account-create-update-fqk6n\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.461023 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gnm4h"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.462341 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.476030 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwj6d\" (UniqueName: \"kubernetes.io/projected/82ac1fbd-600e-4795-855d-3dae65b36263-kube-api-access-bwj6d\") pod \"heat-59cd-account-create-update-fqk6n\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.479336 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b25b-account-create-update-p4l95"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.481233 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.484710 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.510708 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gnm4h"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.526562 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b25b-account-create-update-p4l95"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557003 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-275rp\" (UniqueName: \"kubernetes.io/projected/be505e35-b5c6-434e-9f60-621f6adb19aa-kube-api-access-275rp\") pod \"barbican-db-create-dvgn2\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557155 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-combined-ca-bundle\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557241 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfpjn\" (UniqueName: \"kubernetes.io/projected/bc660678-23e8-42d3-b34e-970411b1b48c-kube-api-access-nfpjn\") pod \"cinder-b25b-account-create-update-p4l95\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557356 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc660678-23e8-42d3-b34e-970411b1b48c-operator-scripts\") pod \"cinder-b25b-account-create-update-p4l95\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557424 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-config-data\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557498 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be505e35-b5c6-434e-9f60-621f6adb19aa-operator-scripts\") pod \"barbican-db-create-dvgn2\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557660 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rpjl\" (UniqueName: \"kubernetes.io/projected/1ff9f4e8-8048-4527-be5c-58279375324a-kube-api-access-6rpjl\") pod \"cinder-db-create-gnm4h\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557737 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mvx\" (UniqueName: \"kubernetes.io/projected/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-kube-api-access-w7mvx\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.557802 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff9f4e8-8048-4527-be5c-58279375324a-operator-scripts\") pod \"cinder-db-create-gnm4h\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.558314 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be505e35-b5c6-434e-9f60-621f6adb19aa-operator-scripts\") pod \"barbican-db-create-dvgn2\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.562774 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-config-data\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.567921 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-combined-ca-bundle\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.602267 4933 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-275rp\" (UniqueName: \"kubernetes.io/projected/be505e35-b5c6-434e-9f60-621f6adb19aa-kube-api-access-275rp\") pod \"barbican-db-create-dvgn2\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.602332 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-bgnf5"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.603692 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.604599 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mvx\" (UniqueName: \"kubernetes.io/projected/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-kube-api-access-w7mvx\") pod \"keystone-db-sync-vh4vf\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.615394 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5615-account-create-update-d844g"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.616948 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.624026 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bgnf5"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.624302 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.644100 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5615-account-create-update-d844g"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.656493 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.657065 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="prometheus" containerID="cri-o://5fd08c6b7b6cb739c7067736e9f8f414c50b71ff937dbe23552505db804fb045" gracePeriod=600 Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.657200 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="thanos-sidecar" containerID="cri-o://12047917740de2ce64dc4eaa0e212329828c8c575ed792a68d6f89b9492c7d76" gracePeriod=600 Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.657242 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="config-reloader" containerID="cri-o://96f014b8f282fa1db98bd8bcc75f61999e91a886400a470edc75f46ddd45439a" gracePeriod=600 Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.663151 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkm8\" (UniqueName: \"kubernetes.io/projected/8f084c2d-a8f4-4082-9130-052fbf9c9a30-kube-api-access-rfkm8\") pod \"neutron-db-create-bgnf5\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: 
I1202 16:14:08.663193 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f084c2d-a8f4-4082-9130-052fbf9c9a30-operator-scripts\") pod \"neutron-db-create-bgnf5\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.663252 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfpjn\" (UniqueName: \"kubernetes.io/projected/bc660678-23e8-42d3-b34e-970411b1b48c-kube-api-access-nfpjn\") pod \"cinder-b25b-account-create-update-p4l95\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.663273 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc660678-23e8-42d3-b34e-970411b1b48c-operator-scripts\") pod \"cinder-b25b-account-create-update-p4l95\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.664026 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc660678-23e8-42d3-b34e-970411b1b48c-operator-scripts\") pod \"cinder-b25b-account-create-update-p4l95\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.664256 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rpjl\" (UniqueName: \"kubernetes.io/projected/1ff9f4e8-8048-4527-be5c-58279375324a-kube-api-access-6rpjl\") pod \"cinder-db-create-gnm4h\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.664288 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-operator-scripts\") pod \"neutron-5615-account-create-update-d844g\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.664326 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff9f4e8-8048-4527-be5c-58279375324a-operator-scripts\") pod \"cinder-db-create-gnm4h\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.664424 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwkl\" (UniqueName: \"kubernetes.io/projected/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-kube-api-access-wlwkl\") pod \"neutron-5615-account-create-update-d844g\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.665074 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff9f4e8-8048-4527-be5c-58279375324a-operator-scripts\") pod \"cinder-db-create-gnm4h\" (UID: 
\"1ff9f4e8-8048-4527-be5c-58279375324a\") " pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.700525 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.705020 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.716580 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rpjl\" (UniqueName: \"kubernetes.io/projected/1ff9f4e8-8048-4527-be5c-58279375324a-kube-api-access-6rpjl\") pod \"cinder-db-create-gnm4h\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.718590 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfpjn\" (UniqueName: \"kubernetes.io/projected/bc660678-23e8-42d3-b34e-970411b1b48c-kube-api-access-nfpjn\") pod \"cinder-b25b-account-create-update-p4l95\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.731910 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6e7e-account-create-update-6zd9l"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.733534 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.738051 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.742460 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6e7e-account-create-update-6zd9l"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.766504 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-operator-scripts\") pod \"neutron-5615-account-create-update-d844g\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.766595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwkl\" (UniqueName: \"kubernetes.io/projected/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-kube-api-access-wlwkl\") pod \"neutron-5615-account-create-update-d844g\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.766635 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9rds\" (UniqueName: \"kubernetes.io/projected/1d265444-f155-4232-802e-b454de66daa6-kube-api-access-x9rds\") pod \"barbican-6e7e-account-create-update-6zd9l\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.766709 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfkm8\" (UniqueName: \"kubernetes.io/projected/8f084c2d-a8f4-4082-9130-052fbf9c9a30-kube-api-access-rfkm8\") pod \"neutron-db-create-bgnf5\" (UID: 
\"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.766751 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f084c2d-a8f4-4082-9130-052fbf9c9a30-operator-scripts\") pod \"neutron-db-create-bgnf5\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.766782 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d265444-f155-4232-802e-b454de66daa6-operator-scripts\") pod \"barbican-6e7e-account-create-update-6zd9l\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.767913 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-operator-scripts\") pod \"neutron-5615-account-create-update-d844g\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.768940 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f084c2d-a8f4-4082-9130-052fbf9c9a30-operator-scripts\") pod \"neutron-db-create-bgnf5\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.770532 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.801662 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwkl\" (UniqueName: \"kubernetes.io/projected/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-kube-api-access-wlwkl\") pod \"neutron-5615-account-create-update-d844g\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.814353 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfkm8\" (UniqueName: \"kubernetes.io/projected/8f084c2d-a8f4-4082-9130-052fbf9c9a30-kube-api-access-rfkm8\") pod \"neutron-db-create-bgnf5\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.842285 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.858338 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.869920 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d265444-f155-4232-802e-b454de66daa6-operator-scripts\") pod \"barbican-6e7e-account-create-update-6zd9l\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.870529 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9rds\" (UniqueName: \"kubernetes.io/projected/1d265444-f155-4232-802e-b454de66daa6-kube-api-access-x9rds\") pod \"barbican-6e7e-account-create-update-6zd9l\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.875156 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d265444-f155-4232-802e-b454de66daa6-operator-scripts\") pod \"barbican-6e7e-account-create-update-6zd9l\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.897271 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ssstq"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.898623 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.910438 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9rds\" (UniqueName: \"kubernetes.io/projected/1d265444-f155-4232-802e-b454de66daa6-kube-api-access-x9rds\") pod \"barbican-6e7e-account-create-update-6zd9l\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.923021 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ssstq"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.945366 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-ab9a-account-create-update-s4k9m"] Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.946801 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.953401 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 02 16:14:08 crc kubenswrapper[4933]: I1202 16:14:08.976448 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ab9a-account-create-update-s4k9m"] Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.000730 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.026995 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.075275 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.077626 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmk4w\" (UniqueName: \"kubernetes.io/projected/ea8158de-843d-4149-8b2d-5621374e8e71-kube-api-access-pmk4w\") pod \"mysqld-exporter-openstack-cell1-db-create-ssstq\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.077871 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3508d064-3032-4754-a0b9-a81c7e3a14b9-operator-scripts\") pod \"mysqld-exporter-ab9a-account-create-update-s4k9m\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.078173 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8158de-843d-4149-8b2d-5621374e8e71-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ssstq\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.078269 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g98m\" (UniqueName: \"kubernetes.io/projected/3508d064-3032-4754-a0b9-a81c7e3a14b9-kube-api-access-6g98m\") pod \"mysqld-exporter-ab9a-account-create-update-s4k9m\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.165609 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9g2jj"] Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.179700 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8158de-843d-4149-8b2d-5621374e8e71-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ssstq\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.179768 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g98m\" (UniqueName: \"kubernetes.io/projected/3508d064-3032-4754-a0b9-a81c7e3a14b9-kube-api-access-6g98m\") pod \"mysqld-exporter-ab9a-account-create-update-s4k9m\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.179842 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmk4w\" (UniqueName: \"kubernetes.io/projected/ea8158de-843d-4149-8b2d-5621374e8e71-kube-api-access-pmk4w\") pod \"mysqld-exporter-openstack-cell1-db-create-ssstq\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.179906 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3508d064-3032-4754-a0b9-a81c7e3a14b9-operator-scripts\") pod \"mysqld-exporter-ab9a-account-create-update-s4k9m\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.180398 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8158de-843d-4149-8b2d-5621374e8e71-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ssstq\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.180675 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3508d064-3032-4754-a0b9-a81c7e3a14b9-operator-scripts\") pod \"mysqld-exporter-ab9a-account-create-update-s4k9m\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.203365 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g98m\" (UniqueName: \"kubernetes.io/projected/3508d064-3032-4754-a0b9-a81c7e3a14b9-kube-api-access-6g98m\") pod \"mysqld-exporter-ab9a-account-create-update-s4k9m\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.206605 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmk4w\" (UniqueName: \"kubernetes.io/projected/ea8158de-843d-4149-8b2d-5621374e8e71-kube-api-access-pmk4w\") pod \"mysqld-exporter-openstack-cell1-db-create-ssstq\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.208569 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8qb5-config-zgdq8" event={"ID":"854d6a3f-90b4-4606-8432-9888fced9115","Type":"ContainerStarted","Data":"f56f437ea9e5050b83f13c02df967bcb6a5972ddd757e066055b967858cf917d"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.208613 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8qb5-config-zgdq8" event={"ID":"854d6a3f-90b4-4606-8432-9888fced9115","Type":"ContainerStarted","Data":"d2f51e43b940de9c3327b2ad840b44c20fe17fc9f1dccc09e4fc71b65f79ff62"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.213090 4933 generic.go:334] "Generic (PLEG): container finished" podID="360b6f71-eb21-4616-87ef-d142ca4a2e1a" containerID="f389c81d1304fc08b5c71de6d57df6441f36092cc3f38637b57496111803c656" exitCode=0 Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.213186 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5pwnn" event={"ID":"360b6f71-eb21-4616-87ef-d142ca4a2e1a","Type":"ContainerDied","Data":"f389c81d1304fc08b5c71de6d57df6441f36092cc3f38637b57496111803c656"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.236874 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-g8qb5-config-zgdq8" podStartSLOduration=3.236849258 podStartE2EDuration="3.236849258s" podCreationTimestamp="2025-12-02 16:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:09.233056757 +0000 UTC m=+1312.484283470" watchObservedRunningTime="2025-12-02 16:14:09.236849258 +0000 UTC m=+1312.488075961" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.238227 4933 generic.go:334] "Generic (PLEG): container finished" podID="ada1850f-4671-48a5-97f3-7b82aab8ab97" containerID="6184db6079b1450eda447c57617970dfad41e894f6184cddfc8d394add273cf6" exitCode=0 Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.238364 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-139c-account-create-update-rws48" event={"ID":"ada1850f-4671-48a5-97f3-7b82aab8ab97","Type":"ContainerDied","Data":"6184db6079b1450eda447c57617970dfad41e894f6184cddfc8d394add273cf6"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.238418 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-139c-account-create-update-rws48" event={"ID":"ada1850f-4671-48a5-97f3-7b82aab8ab97","Type":"ContainerStarted","Data":"99bd7195bf31b017c4301862998afdc4701200cc15fe5c67cc24d3e62870ec42"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.261963 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.262270 4933 generic.go:334] "Generic (PLEG): container finished" podID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerID="12047917740de2ce64dc4eaa0e212329828c8c575ed792a68d6f89b9492c7d76" exitCode=0 Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.262311 4933 generic.go:334] "Generic (PLEG): container finished" podID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerID="96f014b8f282fa1db98bd8bcc75f61999e91a886400a470edc75f46ddd45439a" exitCode=0 Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.262325 4933 generic.go:334] "Generic (PLEG): container finished" podID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerID="5fd08c6b7b6cb739c7067736e9f8f414c50b71ff937dbe23552505db804fb045" exitCode=0 Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.262368 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerDied","Data":"12047917740de2ce64dc4eaa0e212329828c8c575ed792a68d6f89b9492c7d76"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.262406 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerDied","Data":"96f014b8f282fa1db98bd8bcc75f61999e91a886400a470edc75f46ddd45439a"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.262418 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerDied","Data":"5fd08c6b7b6cb739c7067736e9f8f414c50b71ff937dbe23552505db804fb045"} Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.298051 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.413936 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.135:9090/-/ready\": dial tcp 10.217.0.135:9090: connect: connection refused" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.658853 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vh4vf"] Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.789928 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.918807 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-tls-assets\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919106 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-thanos-prometheus-http-client-file\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919149 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-web-config\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919194 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919242 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/803ee8c3-0a9d-46c8-8069-1ebffde1429c-prometheus-metric-storage-rulefiles-0\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919312 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config-out\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919394 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.919438 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t4mn\" (UniqueName: 
\"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-kube-api-access-6t4mn\") pod \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\" (UID: \"803ee8c3-0a9d-46c8-8069-1ebffde1429c\") " Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.931714 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803ee8c3-0a9d-46c8-8069-1ebffde1429c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.932318 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.934405 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config-out" (OuterVolumeSpecName: "config-out") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.938912 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-kube-api-access-6t4mn" (OuterVolumeSpecName: "kube-api-access-6t4mn") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "kube-api-access-6t4mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.942438 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.943675 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config" (OuterVolumeSpecName: "config") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.943912 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:09 crc kubenswrapper[4933]: I1202 16:14:09.960777 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-web-config" (OuterVolumeSpecName: "web-config") pod "803ee8c3-0a9d-46c8-8069-1ebffde1429c" (UID: "803ee8c3-0a9d-46c8-8069-1ebffde1429c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027420 4933 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027472 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027487 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t4mn\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-kube-api-access-6t4mn\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027500 4933 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/803ee8c3-0a9d-46c8-8069-1ebffde1429c-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027529 4933 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027541 4933 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027553 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/803ee8c3-0a9d-46c8-8069-1ebffde1429c-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.027564 4933 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/803ee8c3-0a9d-46c8-8069-1ebffde1429c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.051149 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.108719 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-59cd-account-create-update-fqk6n"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.126639 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dvgn2"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.129045 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.151932 4933 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gnm4h"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.175421 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b25b-account-create-update-p4l95"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.310667 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vh4vf" event={"ID":"8e2abbab-0ae7-4cb8-9b51-a2423b7948da","Type":"ContainerStarted","Data":"32a9b9649788f58aa602fa769faa3467ae9a6343192e69c20b973b666a63be4c"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.316255 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8qb5-config-zgdq8" event={"ID":"854d6a3f-90b4-4606-8432-9888fced9115","Type":"ContainerDied","Data":"f56f437ea9e5050b83f13c02df967bcb6a5972ddd757e066055b967858cf917d"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.317770 4933 generic.go:334] "Generic (PLEG): container finished" podID="854d6a3f-90b4-4606-8432-9888fced9115" containerID="f56f437ea9e5050b83f13c02df967bcb6a5972ddd757e066055b967858cf917d" exitCode=0 Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.320773 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-59cd-account-create-update-fqk6n" event={"ID":"82ac1fbd-600e-4795-855d-3dae65b36263","Type":"ContainerStarted","Data":"6f6955ebb60d6c77019e5318be027dabdc73e00c5ef4ce62853bcf15027257c3"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.322113 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b25b-account-create-update-p4l95" event={"ID":"bc660678-23e8-42d3-b34e-970411b1b48c","Type":"ContainerStarted","Data":"becf2f9f9a898f3f44a5e3439bf3df1e313ff78eabf1e7afc4abe549a7e50895"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.323151 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gnm4h" event={"ID":"1ff9f4e8-8048-4527-be5c-58279375324a","Type":"ContainerStarted","Data":"9530ba469a912991abdf15b66e7027113ff3353343471d9ff0fc221f63f928f1"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.324155 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dvgn2" event={"ID":"be505e35-b5c6-434e-9f60-621f6adb19aa","Type":"ContainerStarted","Data":"832edb4c59b0f375d07efe1ab000752cc82676baacfd8a98e089da6f0cbae442"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.326838 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"803ee8c3-0a9d-46c8-8069-1ebffde1429c","Type":"ContainerDied","Data":"b3619fb9e1eedcc545c5fe8376be0f1111dbf0be41ffd241e4a5351c963aee0b"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.326880 4933 scope.go:117] "RemoveContainer" containerID="12047917740de2ce64dc4eaa0e212329828c8c575ed792a68d6f89b9492c7d76" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.327023 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.339509 4933 generic.go:334] "Generic (PLEG): container finished" podID="1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" containerID="972925d235e7cc5bec7e3cf4f10f50400f5340b66c9f7fe09d26ce2ff882f8c3" exitCode=0 Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.339802 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9g2jj" event={"ID":"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1","Type":"ContainerDied","Data":"972925d235e7cc5bec7e3cf4f10f50400f5340b66c9f7fe09d26ce2ff882f8c3"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.339852 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9g2jj" event={"ID":"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1","Type":"ContainerStarted","Data":"f3eca6b7ffc55b7170f78e147c26de847a57608f8cad2ab67ea5b47cfdfba1ad"} Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.383639 4933 scope.go:117] "RemoveContainer" containerID="96f014b8f282fa1db98bd8bcc75f61999e91a886400a470edc75f46ddd45439a" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.458913 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bgnf5"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.465394 4933 scope.go:117] "RemoveContainer" containerID="5fd08c6b7b6cb739c7067736e9f8f414c50b71ff937dbe23552505db804fb045" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.468374 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.477839 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.530389 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:14:10 crc kubenswrapper[4933]: E1202 16:14:10.530903 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="thanos-sidecar" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.530921 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="thanos-sidecar" Dec 02 16:14:10 crc kubenswrapper[4933]: E1202 16:14:10.530938 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="init-config-reloader" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.530946 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="init-config-reloader" Dec 02 16:14:10 crc kubenswrapper[4933]: E1202 16:14:10.530973 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="prometheus" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.530983 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="prometheus" Dec 02 16:14:10 crc kubenswrapper[4933]: E1202 16:14:10.530996 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="config-reloader" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.531002 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="config-reloader" Dec 02 16:14:10 crc 
kubenswrapper[4933]: I1202 16:14:10.531214 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="thanos-sidecar" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.531238 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="prometheus" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.531251 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" containerName="config-reloader" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.533066 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.542635 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.543079 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.543155 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.543224 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.543327 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fbngp" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.543334 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.573179 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.603683 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ssstq"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.607935 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.621257 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ab9a-account-create-update-s4k9m"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.645422 4933 scope.go:117] "RemoveContainer" containerID="807b2330f3cb44203c7f511d9f2b7abbe69f7ffc78640e9df411d576e748c6d3" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.646097 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6e7e-account-create-update-6zd9l"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647508 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-config\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647539 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647562 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8621d49b-46de-42d1-a6aa-291b8cda18ca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647685 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647725 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647799 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647910 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.647956 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8621d49b-46de-42d1-a6aa-291b8cda18ca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.648599 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8621d49b-46de-42d1-a6aa-291b8cda18ca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.648735 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.648862 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhf77\" (UniqueName: \"kubernetes.io/projected/8621d49b-46de-42d1-a6aa-291b8cda18ca-kube-api-access-lhf77\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.713297 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5615-account-create-update-d844g"] Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8621d49b-46de-42d1-a6aa-291b8cda18ca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788330 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788383 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhf77\" (UniqueName: \"kubernetes.io/projected/8621d49b-46de-42d1-a6aa-291b8cda18ca-kube-api-access-lhf77\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788418 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-config\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788442 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788468 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8621d49b-46de-42d1-a6aa-291b8cda18ca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788518 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc 
kubenswrapper[4933]: I1202 16:14:10.788551 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788591 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788636 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.788666 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8621d49b-46de-42d1-a6aa-291b8cda18ca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.790461 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8621d49b-46de-42d1-a6aa-291b8cda18ca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.793931 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.799053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.799259 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.799311 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-config\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.799541 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8621d49b-46de-42d1-a6aa-291b8cda18ca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.799767 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.805298 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.814528 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8621d49b-46de-42d1-a6aa-291b8cda18ca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.819504 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8621d49b-46de-42d1-a6aa-291b8cda18ca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.833082 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhf77\" (UniqueName: \"kubernetes.io/projected/8621d49b-46de-42d1-a6aa-291b8cda18ca-kube-api-access-lhf77\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.855427 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8621d49b-46de-42d1-a6aa-291b8cda18ca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:10 crc kubenswrapper[4933]: I1202 16:14:10.959251 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:10.997617 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.080720 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803ee8c3-0a9d-46c8-8069-1ebffde1429c" path="/var/lib/kubelet/pods/803ee8c3-0a9d-46c8-8069-1ebffde1429c/volumes" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.353672 4933 generic.go:334] "Generic (PLEG): container finished" podID="82ac1fbd-600e-4795-855d-3dae65b36263" containerID="680359322fbe0d13024f5c435bbcfbef006d6d62d239663d711e12b930f56cb0" exitCode=0 Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.353997 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-59cd-account-create-update-fqk6n" event={"ID":"82ac1fbd-600e-4795-855d-3dae65b36263","Type":"ContainerDied","Data":"680359322fbe0d13024f5c435bbcfbef006d6d62d239663d711e12b930f56cb0"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.387869 4933 generic.go:334] "Generic (PLEG): container finished" podID="bc660678-23e8-42d3-b34e-970411b1b48c" containerID="fe4e7b2cb0d52f4b9903fec1ed83f1d9d7beccad4248982089285af8e762b828" exitCode=0 Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.387957 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b25b-account-create-update-p4l95" event={"ID":"bc660678-23e8-42d3-b34e-970411b1b48c","Type":"ContainerDied","Data":"fe4e7b2cb0d52f4b9903fec1ed83f1d9d7beccad4248982089285af8e762b828"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.390844 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5615-account-create-update-d844g" event={"ID":"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3","Type":"ContainerStarted","Data":"99d00660130cceb9709a98bc5674bab7aba68356681d21ac3e46a00534fdf01b"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.409525 4933 generic.go:334] "Generic (PLEG): container finished" podID="1ff9f4e8-8048-4527-be5c-58279375324a" containerID="2435a130c6381d22ba3e9e38424d379235216bc10da713924dae4f1d2896c524" exitCode=0 Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.409632 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gnm4h" event={"ID":"1ff9f4e8-8048-4527-be5c-58279375324a","Type":"ContainerDied","Data":"2435a130c6381d22ba3e9e38424d379235216bc10da713924dae4f1d2896c524"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.435297 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-139c-account-create-update-rws48" event={"ID":"ada1850f-4671-48a5-97f3-7b82aab8ab97","Type":"ContainerDied","Data":"99bd7195bf31b017c4301862998afdc4701200cc15fe5c67cc24d3e62870ec42"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.435344 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99bd7195bf31b017c4301862998afdc4701200cc15fe5c67cc24d3e62870ec42" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.437154 4933 generic.go:334] "Generic (PLEG): container finished" podID="be505e35-b5c6-434e-9f60-621f6adb19aa" containerID="110e9e366b344c9454e199763a09d2f3161de244d6494ea4d8f115ca99b1faf0" exitCode=0 Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.437213 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dvgn2" event={"ID":"be505e35-b5c6-434e-9f60-621f6adb19aa","Type":"ContainerDied","Data":"110e9e366b344c9454e199763a09d2f3161de244d6494ea4d8f115ca99b1faf0"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.445598 4933 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" event={"ID":"3508d064-3032-4754-a0b9-a81c7e3a14b9","Type":"ContainerStarted","Data":"ae617038e11b8edcdf6c064d49f4c8b64292f8f3c6f53da12219fc50b98dd074"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.454186 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5pwnn" event={"ID":"360b6f71-eb21-4616-87ef-d142ca4a2e1a","Type":"ContainerDied","Data":"b1bf1cb6ce1af60ef261c21923ca050b067814b7559257a04c1691f463e18a58"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.454235 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bf1cb6ce1af60ef261c21923ca050b067814b7559257a04c1691f463e18a58" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.475244 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e7e-account-create-update-6zd9l" event={"ID":"1d265444-f155-4232-802e-b454de66daa6","Type":"ContainerStarted","Data":"cf2b184907cfa70c64ebab1f8c3e138fdd91180ee044f5a6f182d7202453f531"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.481443 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" event={"ID":"ea8158de-843d-4149-8b2d-5621374e8e71","Type":"ContainerStarted","Data":"1dbdc1c6b7998895cd8aebaea25e820ebb4bb6a708b59779f14e495889cc4e06"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.482957 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bgnf5" event={"ID":"8f084c2d-a8f4-4082-9130-052fbf9c9a30","Type":"ContainerStarted","Data":"b4c725abc869ef7ef9576c47f9f2b3c0c23d0c692f2e87a9616f60a404d72370"} Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.501706 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.534664 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.548716 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/360b6f71-eb21-4616-87ef-d142ca4a2e1a-operator-scripts\") pod \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.548876 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zspth\" (UniqueName: \"kubernetes.io/projected/360b6f71-eb21-4616-87ef-d142ca4a2e1a-kube-api-access-zspth\") pod \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\" (UID: \"360b6f71-eb21-4616-87ef-d142ca4a2e1a\") " Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.552615 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/360b6f71-eb21-4616-87ef-d142ca4a2e1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "360b6f71-eb21-4616-87ef-d142ca4a2e1a" (UID: "360b6f71-eb21-4616-87ef-d142ca4a2e1a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.556270 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/360b6f71-eb21-4616-87ef-d142ca4a2e1a-kube-api-access-zspth" (OuterVolumeSpecName: "kube-api-access-zspth") pod "360b6f71-eb21-4616-87ef-d142ca4a2e1a" (UID: "360b6f71-eb21-4616-87ef-d142ca4a2e1a"). InnerVolumeSpecName "kube-api-access-zspth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.613941 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.650873 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada1850f-4671-48a5-97f3-7b82aab8ab97-operator-scripts\") pod \"ada1850f-4671-48a5-97f3-7b82aab8ab97\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.650971 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs99r\" (UniqueName: \"kubernetes.io/projected/ada1850f-4671-48a5-97f3-7b82aab8ab97-kube-api-access-rs99r\") pod \"ada1850f-4671-48a5-97f3-7b82aab8ab97\" (UID: \"ada1850f-4671-48a5-97f3-7b82aab8ab97\") " Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.651753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada1850f-4671-48a5-97f3-7b82aab8ab97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ada1850f-4671-48a5-97f3-7b82aab8ab97" (UID: "ada1850f-4671-48a5-97f3-7b82aab8ab97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.651975 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zspth\" (UniqueName: \"kubernetes.io/projected/360b6f71-eb21-4616-87ef-d142ca4a2e1a-kube-api-access-zspth\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.652014 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada1850f-4671-48a5-97f3-7b82aab8ab97-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.652025 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/360b6f71-eb21-4616-87ef-d142ca4a2e1a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.676301 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada1850f-4671-48a5-97f3-7b82aab8ab97-kube-api-access-rs99r" (OuterVolumeSpecName: "kube-api-access-rs99r") pod "ada1850f-4671-48a5-97f3-7b82aab8ab97" (UID: "ada1850f-4671-48a5-97f3-7b82aab8ab97"). InnerVolumeSpecName "kube-api-access-rs99r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:11 crc kubenswrapper[4933]: I1202 16:14:11.754061 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs99r\" (UniqueName: \"kubernetes.io/projected/ada1850f-4671-48a5-97f3-7b82aab8ab97-kube-api-access-rs99r\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.038420 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060608 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run-ovn\") pod \"854d6a3f-90b4-4606-8432-9888fced9115\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060701 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-additional-scripts\") pod \"854d6a3f-90b4-4606-8432-9888fced9115\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060729 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmw99\" (UniqueName: \"kubernetes.io/projected/854d6a3f-90b4-4606-8432-9888fced9115-kube-api-access-fmw99\") pod \"854d6a3f-90b4-4606-8432-9888fced9115\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060767 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-scripts\") pod \"854d6a3f-90b4-4606-8432-9888fced9115\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060889 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run\") pod \"854d6a3f-90b4-4606-8432-9888fced9115\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060968 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-log-ovn\") pod \"854d6a3f-90b4-4606-8432-9888fced9115\" (UID: \"854d6a3f-90b4-4606-8432-9888fced9115\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.060986 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "854d6a3f-90b4-4606-8432-9888fced9115" (UID: "854d6a3f-90b4-4606-8432-9888fced9115"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.061417 4933 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.061460 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "854d6a3f-90b4-4606-8432-9888fced9115" (UID: "854d6a3f-90b4-4606-8432-9888fced9115"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.062134 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "854d6a3f-90b4-4606-8432-9888fced9115" (UID: "854d6a3f-90b4-4606-8432-9888fced9115"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.065704 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-scripts" (OuterVolumeSpecName: "scripts") pod "854d6a3f-90b4-4606-8432-9888fced9115" (UID: "854d6a3f-90b4-4606-8432-9888fced9115"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.065786 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run" (OuterVolumeSpecName: "var-run") pod "854d6a3f-90b4-4606-8432-9888fced9115" (UID: "854d6a3f-90b4-4606-8432-9888fced9115"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.099036 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854d6a3f-90b4-4606-8432-9888fced9115-kube-api-access-fmw99" (OuterVolumeSpecName: "kube-api-access-fmw99") pod "854d6a3f-90b4-4606-8432-9888fced9115" (UID: "854d6a3f-90b4-4606-8432-9888fced9115"). InnerVolumeSpecName "kube-api-access-fmw99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.164219 4933 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.164275 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmw99\" (UniqueName: \"kubernetes.io/projected/854d6a3f-90b4-4606-8432-9888fced9115-kube-api-access-fmw99\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.164287 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854d6a3f-90b4-4606-8432-9888fced9115-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.164296 4933 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.164304 4933 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/854d6a3f-90b4-4606-8432-9888fced9115-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.186421 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.266200 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-operator-scripts\") pod \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.267306 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" (UID: "1b64b3a1-928a-49f7-b910-6d99cc7e7cf1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.267543 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft59b\" (UniqueName: \"kubernetes.io/projected/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-kube-api-access-ft59b\") pod \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\" (UID: \"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1\") " Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.268251 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.272930 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-kube-api-access-ft59b" (OuterVolumeSpecName: "kube-api-access-ft59b") pod "1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" (UID: "1b64b3a1-928a-49f7-b910-6d99cc7e7cf1"). InnerVolumeSpecName "kube-api-access-ft59b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.349158 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-g8qb5-config-zgdq8"] Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.361127 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-g8qb5-config-zgdq8"] Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.370882 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft59b\" (UniqueName: \"kubernetes.io/projected/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1-kube-api-access-ft59b\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.498867 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-9g2jj" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.498875 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9g2jj" event={"ID":"1b64b3a1-928a-49f7-b910-6d99cc7e7cf1","Type":"ContainerDied","Data":"f3eca6b7ffc55b7170f78e147c26de847a57608f8cad2ab67ea5b47cfdfba1ad"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.498928 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3eca6b7ffc55b7170f78e147c26de847a57608f8cad2ab67ea5b47cfdfba1ad" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.501654 4933 generic.go:334] "Generic (PLEG): container finished" podID="ea8158de-843d-4149-8b2d-5621374e8e71" containerID="75029d347067d95cfb2f6f53714d3275445b13b9bf53c763cc00d6512c4308b4" exitCode=0 Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.501767 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" event={"ID":"ea8158de-843d-4149-8b2d-5621374e8e71","Type":"ContainerDied","Data":"75029d347067d95cfb2f6f53714d3275445b13b9bf53c763cc00d6512c4308b4"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.514060 4933 generic.go:334] "Generic (PLEG): container finished" podID="8f084c2d-a8f4-4082-9130-052fbf9c9a30" containerID="6c1d76c96756bbec8155e7a1c2bf381e04eaeec22018684ad5072b259093274a" exitCode=0 Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.514150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bgnf5" event={"ID":"8f084c2d-a8f4-4082-9130-052fbf9c9a30","Type":"ContainerDied","Data":"6c1d76c96756bbec8155e7a1c2bf381e04eaeec22018684ad5072b259093274a"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.516672 4933 generic.go:334] "Generic (PLEG): container finished" podID="3508d064-3032-4754-a0b9-a81c7e3a14b9" containerID="3d3a713cc446acee639e6dad31a015f8d81143a07efdac0c0f386481d3124631" exitCode=0 Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.516715 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" event={"ID":"3508d064-3032-4754-a0b9-a81c7e3a14b9","Type":"ContainerDied","Data":"3d3a713cc446acee639e6dad31a015f8d81143a07efdac0c0f386481d3124631"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.518701 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-g8qb5-config-zgdq8" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.521467 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f51e43b940de9c3327b2ad840b44c20fe17fc9f1dccc09e4fc71b65f79ff62" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.537641 4933 generic.go:334] "Generic (PLEG): container finished" podID="c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" containerID="714b6d3a0e45b0bca815b97029794c499c5e4972182ee301d007b07f8025b6da" exitCode=0 Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.537890 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5615-account-create-update-d844g" event={"ID":"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3","Type":"ContainerDied","Data":"714b6d3a0e45b0bca815b97029794c499c5e4972182ee301d007b07f8025b6da"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.546213 4933 generic.go:334] "Generic (PLEG): container finished" podID="1d265444-f155-4232-802e-b454de66daa6" containerID="dfa4f9abf32790f776aba1c95b5303a69d2be0c713f9ab9292d8904028a9af3f" exitCode=0 Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.546405 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e7e-account-create-update-6zd9l" event={"ID":"1d265444-f155-4232-802e-b454de66daa6","Type":"ContainerDied","Data":"dfa4f9abf32790f776aba1c95b5303a69d2be0c713f9ab9292d8904028a9af3f"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.548482 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-139c-account-create-update-rws48" Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.548521 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8621d49b-46de-42d1-a6aa-291b8cda18ca","Type":"ContainerStarted","Data":"1054d74f5b81d75d8faf69419d1e5f52e7db100ff8d24ae166d97420ac852a72"} Dec 02 16:14:12 crc kubenswrapper[4933]: I1202 16:14:12.548933 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5pwnn" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.026650 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.076258 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854d6a3f-90b4-4606-8432-9888fced9115" path="/var/lib/kubelet/pods/854d6a3f-90b4-4606-8432-9888fced9115/volumes" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.200729 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rpjl\" (UniqueName: \"kubernetes.io/projected/1ff9f4e8-8048-4527-be5c-58279375324a-kube-api-access-6rpjl\") pod \"1ff9f4e8-8048-4527-be5c-58279375324a\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.200788 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff9f4e8-8048-4527-be5c-58279375324a-operator-scripts\") pod \"1ff9f4e8-8048-4527-be5c-58279375324a\" (UID: \"1ff9f4e8-8048-4527-be5c-58279375324a\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.201704 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff9f4e8-8048-4527-be5c-58279375324a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ff9f4e8-8048-4527-be5c-58279375324a" (UID: "1ff9f4e8-8048-4527-be5c-58279375324a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.208084 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff9f4e8-8048-4527-be5c-58279375324a-kube-api-access-6rpjl" (OuterVolumeSpecName: "kube-api-access-6rpjl") pod "1ff9f4e8-8048-4527-be5c-58279375324a" (UID: "1ff9f4e8-8048-4527-be5c-58279375324a"). InnerVolumeSpecName "kube-api-access-6rpjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.268629 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.280031 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.333949 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rpjl\" (UniqueName: \"kubernetes.io/projected/1ff9f4e8-8048-4527-be5c-58279375324a-kube-api-access-6rpjl\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.333983 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff9f4e8-8048-4527-be5c-58279375324a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.344410 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.434860 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-275rp\" (UniqueName: \"kubernetes.io/projected/be505e35-b5c6-434e-9f60-621f6adb19aa-kube-api-access-275rp\") pod \"be505e35-b5c6-434e-9f60-621f6adb19aa\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.434981 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82ac1fbd-600e-4795-855d-3dae65b36263-operator-scripts\") pod \"82ac1fbd-600e-4795-855d-3dae65b36263\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.435051 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be505e35-b5c6-434e-9f60-621f6adb19aa-operator-scripts\") pod \"be505e35-b5c6-434e-9f60-621f6adb19aa\" (UID: \"be505e35-b5c6-434e-9f60-621f6adb19aa\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.435096 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwj6d\" (UniqueName: \"kubernetes.io/projected/82ac1fbd-600e-4795-855d-3dae65b36263-kube-api-access-bwj6d\") pod \"82ac1fbd-600e-4795-855d-3dae65b36263\" (UID: \"82ac1fbd-600e-4795-855d-3dae65b36263\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.435613 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ac1fbd-600e-4795-855d-3dae65b36263-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82ac1fbd-600e-4795-855d-3dae65b36263" (UID: "82ac1fbd-600e-4795-855d-3dae65b36263"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.437439 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be505e35-b5c6-434e-9f60-621f6adb19aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be505e35-b5c6-434e-9f60-621f6adb19aa" (UID: "be505e35-b5c6-434e-9f60-621f6adb19aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.455079 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be505e35-b5c6-434e-9f60-621f6adb19aa-kube-api-access-275rp" (OuterVolumeSpecName: "kube-api-access-275rp") pod "be505e35-b5c6-434e-9f60-621f6adb19aa" (UID: "be505e35-b5c6-434e-9f60-621f6adb19aa"). InnerVolumeSpecName "kube-api-access-275rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.455146 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ac1fbd-600e-4795-855d-3dae65b36263-kube-api-access-bwj6d" (OuterVolumeSpecName: "kube-api-access-bwj6d") pod "82ac1fbd-600e-4795-855d-3dae65b36263" (UID: "82ac1fbd-600e-4795-855d-3dae65b36263"). InnerVolumeSpecName "kube-api-access-bwj6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.538039 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc660678-23e8-42d3-b34e-970411b1b48c-operator-scripts\") pod \"bc660678-23e8-42d3-b34e-970411b1b48c\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.538360 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfpjn\" (UniqueName: \"kubernetes.io/projected/bc660678-23e8-42d3-b34e-970411b1b48c-kube-api-access-nfpjn\") pod \"bc660678-23e8-42d3-b34e-970411b1b48c\" (UID: \"bc660678-23e8-42d3-b34e-970411b1b48c\") " Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.538843 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-275rp\" (UniqueName: \"kubernetes.io/projected/be505e35-b5c6-434e-9f60-621f6adb19aa-kube-api-access-275rp\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.538862 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82ac1fbd-600e-4795-855d-3dae65b36263-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.538871 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be505e35-b5c6-434e-9f60-621f6adb19aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.538881 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwj6d\" (UniqueName: \"kubernetes.io/projected/82ac1fbd-600e-4795-855d-3dae65b36263-kube-api-access-bwj6d\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.540885 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc660678-23e8-42d3-b34e-970411b1b48c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc660678-23e8-42d3-b34e-970411b1b48c" (UID: "bc660678-23e8-42d3-b34e-970411b1b48c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.543364 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc660678-23e8-42d3-b34e-970411b1b48c-kube-api-access-nfpjn" (OuterVolumeSpecName: "kube-api-access-nfpjn") pod "bc660678-23e8-42d3-b34e-970411b1b48c" (UID: "bc660678-23e8-42d3-b34e-970411b1b48c"). InnerVolumeSpecName "kube-api-access-nfpjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.564221 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b25b-account-create-update-p4l95" event={"ID":"bc660678-23e8-42d3-b34e-970411b1b48c","Type":"ContainerDied","Data":"becf2f9f9a898f3f44a5e3439bf3df1e313ff78eabf1e7afc4abe549a7e50895"} Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.564269 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="becf2f9f9a898f3f44a5e3439bf3df1e313ff78eabf1e7afc4abe549a7e50895" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.564338 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b25b-account-create-update-p4l95" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.578158 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gnm4h" event={"ID":"1ff9f4e8-8048-4527-be5c-58279375324a","Type":"ContainerDied","Data":"9530ba469a912991abdf15b66e7027113ff3353343471d9ff0fc221f63f928f1"} Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.578203 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9530ba469a912991abdf15b66e7027113ff3353343471d9ff0fc221f63f928f1" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.578271 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gnm4h" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.582471 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dvgn2" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.582499 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dvgn2" event={"ID":"be505e35-b5c6-434e-9f60-621f6adb19aa","Type":"ContainerDied","Data":"832edb4c59b0f375d07efe1ab000752cc82676baacfd8a98e089da6f0cbae442"} Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.582556 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832edb4c59b0f375d07efe1ab000752cc82676baacfd8a98e089da6f0cbae442" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.584480 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-59cd-account-create-update-fqk6n" event={"ID":"82ac1fbd-600e-4795-855d-3dae65b36263","Type":"ContainerDied","Data":"6f6955ebb60d6c77019e5318be027dabdc73e00c5ef4ce62853bcf15027257c3"} Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.584532 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f6955ebb60d6c77019e5318be027dabdc73e00c5ef4ce62853bcf15027257c3" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.584561 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-59cd-account-create-update-fqk6n" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.640615 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc660678-23e8-42d3-b34e-970411b1b48c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:13 crc kubenswrapper[4933]: I1202 16:14:13.640640 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfpjn\" (UniqueName: \"kubernetes.io/projected/bc660678-23e8-42d3-b34e-970411b1b48c-kube-api-access-nfpjn\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:15 crc kubenswrapper[4933]: I1202 16:14:15.606684 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8621d49b-46de-42d1-a6aa-291b8cda18ca","Type":"ContainerStarted","Data":"ac1f47d0c5533c1b960ff5bad3f8be41a7c6d87a84b6e4222c64094818c8ab82"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.091259 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.098577 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cf325ab8-af91-4009-9e8b-a299db2234da-etc-swift\") pod \"swift-storage-0\" (UID: \"cf325ab8-af91-4009-9e8b-a299db2234da\") " pod="openstack/swift-storage-0" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.297766 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.353214 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-x8fgp"] Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354157 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354188 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354215 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff9f4e8-8048-4527-be5c-58279375324a" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354228 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff9f4e8-8048-4527-be5c-58279375324a" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354254 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada1850f-4671-48a5-97f3-7b82aab8ab97" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354266 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada1850f-4671-48a5-97f3-7b82aab8ab97" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354297 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360b6f71-eb21-4616-87ef-d142ca4a2e1a" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354310 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="360b6f71-eb21-4616-87ef-d142ca4a2e1a" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354338 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be505e35-b5c6-434e-9f60-621f6adb19aa" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354351 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="be505e35-b5c6-434e-9f60-621f6adb19aa" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354373 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854d6a3f-90b4-4606-8432-9888fced9115" containerName="ovn-config" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354385 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="854d6a3f-90b4-4606-8432-9888fced9115" containerName="ovn-config" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354403 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ac1fbd-600e-4795-855d-3dae65b36263" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354415 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ac1fbd-600e-4795-855d-3dae65b36263" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: E1202 16:14:16.354453 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc660678-23e8-42d3-b34e-970411b1b48c" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354468 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc660678-23e8-42d3-b34e-970411b1b48c" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354851 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ac1fbd-600e-4795-855d-3dae65b36263" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354893 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="360b6f71-eb21-4616-87ef-d142ca4a2e1a" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354908 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="be505e35-b5c6-434e-9f60-621f6adb19aa" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354936 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff9f4e8-8048-4527-be5c-58279375324a" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354957 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="854d6a3f-90b4-4606-8432-9888fced9115" containerName="ovn-config" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.354995 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" containerName="mariadb-database-create" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.355017 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada1850f-4671-48a5-97f3-7b82aab8ab97" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.355029 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc660678-23e8-42d3-b34e-970411b1b48c" containerName="mariadb-account-create-update" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.356195 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.362115 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fkqb" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.362430 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.366163 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-g8qb5" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.372088 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x8fgp"] Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.505789 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.507757 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-db-sync-config-data\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.507861 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djvls\" (UniqueName: \"kubernetes.io/projected/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-kube-api-access-djvls\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.508038 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-combined-ca-bundle\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.508160 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-config-data\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.519915 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.542965 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.574158 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.579616 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.609106 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8158de-843d-4149-8b2d-5621374e8e71-operator-scripts\") pod \"ea8158de-843d-4149-8b2d-5621374e8e71\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.609203 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmk4w\" (UniqueName: \"kubernetes.io/projected/ea8158de-843d-4149-8b2d-5621374e8e71-kube-api-access-pmk4w\") pod \"ea8158de-843d-4149-8b2d-5621374e8e71\" (UID: \"ea8158de-843d-4149-8b2d-5621374e8e71\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.609751 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-config-data\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.609846 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-db-sync-config-data\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.609869 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djvls\" (UniqueName: \"kubernetes.io/projected/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-kube-api-access-djvls\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.609975 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-combined-ca-bundle\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.610262 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea8158de-843d-4149-8b2d-5621374e8e71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea8158de-843d-4149-8b2d-5621374e8e71" (UID: "ea8158de-843d-4149-8b2d-5621374e8e71"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.617206 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-config-data\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.617250 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-combined-ca-bundle\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.630382 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-db-sync-config-data\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.631327 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vh4vf" event={"ID":"8e2abbab-0ae7-4cb8-9b51-a2423b7948da","Type":"ContainerStarted","Data":"fee102c14fb246b50092852ad06657a7a932daf8c51286fd2e6bf917c0160042"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.639346 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8158de-843d-4149-8b2d-5621374e8e71-kube-api-access-pmk4w" (OuterVolumeSpecName: "kube-api-access-pmk4w") pod "ea8158de-843d-4149-8b2d-5621374e8e71" (UID: "ea8158de-843d-4149-8b2d-5621374e8e71"). InnerVolumeSpecName "kube-api-access-pmk4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.639903 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" event={"ID":"ea8158de-843d-4149-8b2d-5621374e8e71","Type":"ContainerDied","Data":"1dbdc1c6b7998895cd8aebaea25e820ebb4bb6a708b59779f14e495889cc4e06"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.639934 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dbdc1c6b7998895cd8aebaea25e820ebb4bb6a708b59779f14e495889cc4e06" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.639984 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ssstq" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.641120 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bgnf5" event={"ID":"8f084c2d-a8f4-4082-9130-052fbf9c9a30","Type":"ContainerDied","Data":"b4c725abc869ef7ef9576c47f9f2b3c0c23d0c692f2e87a9616f60a404d72370"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.641157 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4c725abc869ef7ef9576c47f9f2b3c0c23d0c692f2e87a9616f60a404d72370" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.641210 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bgnf5" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.642682 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" event={"ID":"3508d064-3032-4754-a0b9-a81c7e3a14b9","Type":"ContainerDied","Data":"ae617038e11b8edcdf6c064d49f4c8b64292f8f3c6f53da12219fc50b98dd074"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.642703 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae617038e11b8edcdf6c064d49f4c8b64292f8f3c6f53da12219fc50b98dd074" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.642739 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ab9a-account-create-update-s4k9m" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.653201 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djvls\" (UniqueName: \"kubernetes.io/projected/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-kube-api-access-djvls\") pod \"glance-db-sync-x8fgp\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.655156 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5615-account-create-update-d844g" event={"ID":"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3","Type":"ContainerDied","Data":"99d00660130cceb9709a98bc5674bab7aba68356681d21ac3e46a00534fdf01b"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.655189 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d00660130cceb9709a98bc5674bab7aba68356681d21ac3e46a00534fdf01b" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.655263 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5615-account-create-update-d844g" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.661167 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6e7e-account-create-update-6zd9l" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.661367 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e7e-account-create-update-6zd9l" event={"ID":"1d265444-f155-4232-802e-b454de66daa6","Type":"ContainerDied","Data":"cf2b184907cfa70c64ebab1f8c3e138fdd91180ee044f5a6f182d7202453f531"} Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.661396 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf2b184907cfa70c64ebab1f8c3e138fdd91180ee044f5a6f182d7202453f531" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.710718 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfkm8\" (UniqueName: \"kubernetes.io/projected/8f084c2d-a8f4-4082-9130-052fbf9c9a30-kube-api-access-rfkm8\") pod \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.710893 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-operator-scripts\") pod \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.710921 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3508d064-3032-4754-a0b9-a81c7e3a14b9-operator-scripts\") pod \"3508d064-3032-4754-a0b9-a81c7e3a14b9\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.710949 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d265444-f155-4232-802e-b454de66daa6-operator-scripts\") pod \"1d265444-f155-4232-802e-b454de66daa6\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.710990 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlwkl\" (UniqueName: \"kubernetes.io/projected/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-kube-api-access-wlwkl\") pod \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\" (UID: \"c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.711047 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9rds\" (UniqueName: \"kubernetes.io/projected/1d265444-f155-4232-802e-b454de66daa6-kube-api-access-x9rds\") pod \"1d265444-f155-4232-802e-b454de66daa6\" (UID: \"1d265444-f155-4232-802e-b454de66daa6\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.711117 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g98m\" (UniqueName: \"kubernetes.io/projected/3508d064-3032-4754-a0b9-a81c7e3a14b9-kube-api-access-6g98m\") pod \"3508d064-3032-4754-a0b9-a81c7e3a14b9\" (UID: \"3508d064-3032-4754-a0b9-a81c7e3a14b9\") " Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.711146 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f084c2d-a8f4-4082-9130-052fbf9c9a30-operator-scripts\") pod \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\" (UID: \"8f084c2d-a8f4-4082-9130-052fbf9c9a30\") " Dec 02 16:14:16 crc 
kubenswrapper[4933]: I1202 16:14:16.711623 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d265444-f155-4232-802e-b454de66daa6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d265444-f155-4232-802e-b454de66daa6" (UID: "1d265444-f155-4232-802e-b454de66daa6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.711668 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8158de-843d-4149-8b2d-5621374e8e71-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.711681 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmk4w\" (UniqueName: \"kubernetes.io/projected/ea8158de-843d-4149-8b2d-5621374e8e71-kube-api-access-pmk4w\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.712079 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" (UID: "c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.712434 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3508d064-3032-4754-a0b9-a81c7e3a14b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3508d064-3032-4754-a0b9-a81c7e3a14b9" (UID: "3508d064-3032-4754-a0b9-a81c7e3a14b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.712504 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f084c2d-a8f4-4082-9130-052fbf9c9a30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f084c2d-a8f4-4082-9130-052fbf9c9a30" (UID: "8f084c2d-a8f4-4082-9130-052fbf9c9a30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.714992 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3508d064-3032-4754-a0b9-a81c7e3a14b9-kube-api-access-6g98m" (OuterVolumeSpecName: "kube-api-access-6g98m") pod "3508d064-3032-4754-a0b9-a81c7e3a14b9" (UID: "3508d064-3032-4754-a0b9-a81c7e3a14b9"). InnerVolumeSpecName "kube-api-access-6g98m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.716642 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d265444-f155-4232-802e-b454de66daa6-kube-api-access-x9rds" (OuterVolumeSpecName: "kube-api-access-x9rds") pod "1d265444-f155-4232-802e-b454de66daa6" (UID: "1d265444-f155-4232-802e-b454de66daa6"). InnerVolumeSpecName "kube-api-access-x9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.716715 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f084c2d-a8f4-4082-9130-052fbf9c9a30-kube-api-access-rfkm8" (OuterVolumeSpecName: "kube-api-access-rfkm8") pod "8f084c2d-a8f4-4082-9130-052fbf9c9a30" (UID: "8f084c2d-a8f4-4082-9130-052fbf9c9a30"). InnerVolumeSpecName "kube-api-access-rfkm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.716916 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-kube-api-access-wlwkl" (OuterVolumeSpecName: "kube-api-access-wlwkl") pod "c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" (UID: "c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3"). InnerVolumeSpecName "kube-api-access-wlwkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.813932 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfkm8\" (UniqueName: \"kubernetes.io/projected/8f084c2d-a8f4-4082-9130-052fbf9c9a30-kube-api-access-rfkm8\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814270 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814285 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3508d064-3032-4754-a0b9-a81c7e3a14b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814298 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d265444-f155-4232-802e-b454de66daa6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814310 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlwkl\" (UniqueName: \"kubernetes.io/projected/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3-kube-api-access-wlwkl\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814322 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9rds\" (UniqueName: \"kubernetes.io/projected/1d265444-f155-4232-802e-b454de66daa6-kube-api-access-x9rds\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814333 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g98m\" (UniqueName: \"kubernetes.io/projected/3508d064-3032-4754-a0b9-a81c7e3a14b9-kube-api-access-6g98m\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.814345 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f084c2d-a8f4-4082-9130-052fbf9c9a30-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:16 crc kubenswrapper[4933]: I1202 16:14:16.911628 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x8fgp" Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.073764 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.170639 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.170697 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:14:17 crc kubenswrapper[4933]: W1202 16:14:17.480805 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b7bd174_a0e2_41ff_bc1e_46095ec6f2c4.slice/crio-64cc692081ef026e3a5ad414a271e843a471f5f0994d2b7549c4b30da61264a0 WatchSource:0}: Error finding container 64cc692081ef026e3a5ad414a271e843a471f5f0994d2b7549c4b30da61264a0: Status 404 returned error can't find the container with id 64cc692081ef026e3a5ad414a271e843a471f5f0994d2b7549c4b30da61264a0 Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.482190 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x8fgp"] Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.671233 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"2cab5f2ff8d11495bac81b6c1878faeec321fa2e098b203af14db114c74cfcd5"} Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.674516 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x8fgp" event={"ID":"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4","Type":"ContainerStarted","Data":"64cc692081ef026e3a5ad414a271e843a471f5f0994d2b7549c4b30da61264a0"} Dec 02 16:14:17 crc kubenswrapper[4933]: I1202 16:14:17.698863 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vh4vf" podStartSLOduration=3.007983449 podStartE2EDuration="9.698846574s" podCreationTimestamp="2025-12-02 16:14:08 +0000 UTC" firstStartedPulling="2025-12-02 16:14:09.670810859 +0000 UTC m=+1312.922037562" lastFinishedPulling="2025-12-02 16:14:16.361673974 +0000 UTC m=+1319.612900687" observedRunningTime="2025-12-02 16:14:17.691368445 +0000 UTC m=+1320.942595168" watchObservedRunningTime="2025-12-02 16:14:17.698846574 +0000 UTC m=+1320.950073277" Dec 02 16:14:18 crc kubenswrapper[4933]: I1202 16:14:18.693605 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"bde78984749dec4c1e20c7273b3f3d08f6e104d3a38db65b57fc61eeb152f462"} Dec 02 16:14:18 crc kubenswrapper[4933]: I1202 16:14:18.693924 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"0ad2edf9835bbcfc5790cdc9c03bacfb657d62466ab141f7f200cc02de03c7ea"} Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.262911 
Dec 02 16:14:19 crc kubenswrapper[4933]: E1202 16:14:19.263812 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.263863 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: E1202 16:14:19.263902 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f084c2d-a8f4-4082-9130-052fbf9c9a30" containerName="mariadb-database-create"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.263935 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f084c2d-a8f4-4082-9130-052fbf9c9a30" containerName="mariadb-database-create"
Dec 02 16:14:19 crc kubenswrapper[4933]: E1202 16:14:19.263960 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3508d064-3032-4754-a0b9-a81c7e3a14b9" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.263968 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3508d064-3032-4754-a0b9-a81c7e3a14b9" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: E1202 16:14:19.263982 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d265444-f155-4232-802e-b454de66daa6" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264009 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d265444-f155-4232-802e-b454de66daa6" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: E1202 16:14:19.264020 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8158de-843d-4149-8b2d-5621374e8e71" containerName="mariadb-database-create"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264028 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8158de-843d-4149-8b2d-5621374e8e71" containerName="mariadb-database-create"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264316 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d265444-f155-4232-802e-b454de66daa6" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264356 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3508d064-3032-4754-a0b9-a81c7e3a14b9" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264371 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8158de-843d-4149-8b2d-5621374e8e71" containerName="mariadb-database-create"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264379 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" containerName="mariadb-account-create-update"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.264394 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f084c2d-a8f4-4082-9130-052fbf9c9a30" containerName="mariadb-database-create"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.265985 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.271372 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.325485 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.331262 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbftz\" (UniqueName: \"kubernetes.io/projected/fa0c36d2-33b1-4415-9413-7835332aa49d-kube-api-access-gbftz\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.331553 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.331578 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.434080 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbftz\" (UniqueName: \"kubernetes.io/projected/fa0c36d2-33b1-4415-9413-7835332aa49d-kube-api-access-gbftz\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.434191 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.434222 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.443751 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.456067 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbftz\" (UniqueName: \"kubernetes.io/projected/fa0c36d2-33b1-4415-9413-7835332aa49d-kube-api-access-gbftz\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.478749 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0"
(UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data\") pod \"mysqld-exporter-0\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " pod="openstack/mysqld-exporter-0" Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.712350 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"a37e3ca887cf30bc1fd997b7d864eaa926f2abb4f5965f2ca0f375ae89df71a8"} Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.713107 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"36ce5f0938ea937e63a3484466ae6faa6a0df9e9aa4b96f5aaa4098886608556"} Dec 02 16:14:19 crc kubenswrapper[4933]: I1202 16:14:19.745634 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 16:14:20 crc kubenswrapper[4933]: I1202 16:14:20.671348 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:14:20 crc kubenswrapper[4933]: I1202 16:14:20.733242 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"fa0c36d2-33b1-4415-9413-7835332aa49d","Type":"ContainerStarted","Data":"a44b461c55935785ce7e2b10367e551519a02f9b3e481e0fbd8d60797fe7238d"} Dec 02 16:14:21 crc kubenswrapper[4933]: E1202 16:14:21.232727 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e2abbab_0ae7_4cb8_9b51_a2423b7948da.slice/crio-fee102c14fb246b50092852ad06657a7a932daf8c51286fd2e6bf917c0160042.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e2abbab_0ae7_4cb8_9b51_a2423b7948da.slice/crio-conmon-fee102c14fb246b50092852ad06657a7a932daf8c51286fd2e6bf917c0160042.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:14:21 crc kubenswrapper[4933]: I1202 16:14:21.743841 4933 generic.go:334] "Generic (PLEG): container finished" podID="8e2abbab-0ae7-4cb8-9b51-a2423b7948da" containerID="fee102c14fb246b50092852ad06657a7a932daf8c51286fd2e6bf917c0160042" exitCode=0 Dec 02 16:14:21 crc kubenswrapper[4933]: I1202 16:14:21.743933 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vh4vf" event={"ID":"8e2abbab-0ae7-4cb8-9b51-a2423b7948da","Type":"ContainerDied","Data":"fee102c14fb246b50092852ad06657a7a932daf8c51286fd2e6bf917c0160042"} Dec 02 16:14:22 crc kubenswrapper[4933]: I1202 16:14:22.753333 4933 generic.go:334] "Generic (PLEG): container finished" podID="8621d49b-46de-42d1-a6aa-291b8cda18ca" containerID="ac1f47d0c5533c1b960ff5bad3f8be41a7c6d87a84b6e4222c64094818c8ab82" exitCode=0 Dec 02 16:14:22 crc kubenswrapper[4933]: I1202 16:14:22.753432 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8621d49b-46de-42d1-a6aa-291b8cda18ca","Type":"ContainerDied","Data":"ac1f47d0c5533c1b960ff5bad3f8be41a7c6d87a84b6e4222c64094818c8ab82"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.077682 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.238173 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7mvx\" (UniqueName: \"kubernetes.io/projected/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-kube-api-access-w7mvx\") pod \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.238410 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-combined-ca-bundle\") pod \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.238475 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-config-data\") pod \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\" (UID: \"8e2abbab-0ae7-4cb8-9b51-a2423b7948da\") " Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.245151 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-kube-api-access-w7mvx" (OuterVolumeSpecName: "kube-api-access-w7mvx") pod "8e2abbab-0ae7-4cb8-9b51-a2423b7948da" (UID: "8e2abbab-0ae7-4cb8-9b51-a2423b7948da"). InnerVolumeSpecName "kube-api-access-w7mvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.268934 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e2abbab-0ae7-4cb8-9b51-a2423b7948da" (UID: "8e2abbab-0ae7-4cb8-9b51-a2423b7948da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.291507 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-config-data" (OuterVolumeSpecName: "config-data") pod "8e2abbab-0ae7-4cb8-9b51-a2423b7948da" (UID: "8e2abbab-0ae7-4cb8-9b51-a2423b7948da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.340943 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7mvx\" (UniqueName: \"kubernetes.io/projected/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-kube-api-access-w7mvx\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.340985 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.340997 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2abbab-0ae7-4cb8-9b51-a2423b7948da-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.782113 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"fa0c36d2-33b1-4415-9413-7835332aa49d","Type":"ContainerStarted","Data":"2d89454283151b2d688ebfe4eec249b0c87cde5ca22c4f55a067a14c997607d2"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.786878 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8621d49b-46de-42d1-a6aa-291b8cda18ca","Type":"ContainerStarted","Data":"b065b65b311986ae48408aba11bf557062f4f16b164008a9af0f8e1ce33aead5"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.790398 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vh4vf" event={"ID":"8e2abbab-0ae7-4cb8-9b51-a2423b7948da","Type":"ContainerDied","Data":"32a9b9649788f58aa602fa769faa3467ae9a6343192e69c20b973b666a63be4c"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.790451 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32a9b9649788f58aa602fa769faa3467ae9a6343192e69c20b973b666a63be4c" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.790473 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vh4vf" Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.804613 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"345e7d1597f7196cbbeaac3e2d685bdd45e2fe36f189b55dcda0c7ace0d10348"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.804673 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"7a7fc94ced4ff8e89fd8c9a271eabbf090227dbea67877ce18ee506d3ae64dce"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.804688 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"ba653e4e5b917cb3aa1413c01b4c939929b80af78402f356651f0e7045bd5254"} Dec 02 16:14:23 crc kubenswrapper[4933]: I1202 16:14:23.818648 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.874543607 podStartE2EDuration="4.818626765s" podCreationTimestamp="2025-12-02 16:14:19 +0000 UTC" firstStartedPulling="2025-12-02 16:14:20.685723779 +0000 UTC m=+1323.936950482" lastFinishedPulling="2025-12-02 16:14:22.629806937 +0000 UTC m=+1325.881033640" observedRunningTime="2025-12-02 16:14:23.80529692 +0000 UTC m=+1327.056523623" watchObservedRunningTime="2025-12-02 16:14:23.818626765 +0000 UTC m=+1327.069853478" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.020060 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-q9sfs"] Dec 02 16:14:24 crc kubenswrapper[4933]: E1202 16:14:24.020569 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2abbab-0ae7-4cb8-9b51-a2423b7948da" containerName="keystone-db-sync" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.022326 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2abbab-0ae7-4cb8-9b51-a2423b7948da" containerName="keystone-db-sync" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.022852 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2abbab-0ae7-4cb8-9b51-a2423b7948da" containerName="keystone-db-sync" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.024320 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.043982 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-q9sfs"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.084733 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7fbv"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.086737 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.087032 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.087108 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x949c\" (UniqueName: \"kubernetes.io/projected/0a52b684-ce41-4f94-a185-3b8c522cd365-kube-api-access-x949c\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.087175 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.087201 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.087257 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-config\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.097486 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.097900 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.098070 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.098237 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntjs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.099029 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.142358 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7fbv"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.233245 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-config\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.233689 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.234340 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-config\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.237945 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x949c\" (UniqueName: \"kubernetes.io/projected/0a52b684-ce41-4f94-a185-3b8c522cd365-kube-api-access-x949c\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.238232 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.238310 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.259548 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-kzxbf"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.262089 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.269434 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.273367 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.289661 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x949c\" (UniqueName: \"kubernetes.io/projected/0a52b684-ce41-4f94-a185-3b8c522cd365-kube-api-access-x949c\") pod \"dnsmasq-dns-5c9d85d47c-q9sfs\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 
16:14:24.294257 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.298770 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-kbbdj" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.299237 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.321368 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kzxbf"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341342 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-fernet-keys\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341422 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzcf\" (UniqueName: \"kubernetes.io/projected/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-kube-api-access-ztzcf\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341460 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-credential-keys\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341515 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-combined-ca-bundle\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341553 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-scripts\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341575 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-combined-ca-bundle\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-config-data\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341625 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-config-data\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.341650 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxrb\" (UniqueName: \"kubernetes.io/projected/c4239535-0d5c-4b17-a695-9f57efb4d381-kube-api-access-mqxrb\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.347779 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sxblk"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.349366 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.385379 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.386070 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.386259 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k4lbn" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.386948 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gtbcc"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.389710 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.399843 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sxblk"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.400293 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.401181 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4k8nt" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.401479 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.408550 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gtbcc"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.445658 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-config-data\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.445877 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqxrb\" (UniqueName: \"kubernetes.io/projected/c4239535-0d5c-4b17-a695-9f57efb4d381-kube-api-access-mqxrb\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.445939 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-fernet-keys\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.445987 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzcf\" (UniqueName: \"kubernetes.io/projected/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-kube-api-access-ztzcf\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.446020 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-credential-keys\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.446068 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-combined-ca-bundle\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.446105 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-scripts\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.446127 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-combined-ca-bundle\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.446149 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-config-data\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.471697 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.492098 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-config-data\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.496022 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-scripts\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.503038 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-combined-ca-bundle\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.505768 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-credential-keys\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.514576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqxrb\" (UniqueName: \"kubernetes.io/projected/c4239535-0d5c-4b17-a695-9f57efb4d381-kube-api-access-mqxrb\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.521533 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-fernet-keys\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551017 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c326307-73df-462e-98ec-0e4dc89fdd54-etc-machine-id\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551112 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-config\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551131 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-combined-ca-bundle\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551159 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-combined-ca-bundle\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551190 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-db-sync-config-data\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551236 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-scripts\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551269 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgfhp\" (UniqueName: \"kubernetes.io/projected/d342549d-487a-4b41-89cc-8ce263aed373-kube-api-access-tgfhp\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551301 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-config-data\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.551339 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htbmg\" (UniqueName: \"kubernetes.io/projected/9c326307-73df-462e-98ec-0e4dc89fdd54-kube-api-access-htbmg\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.552980 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-combined-ca-bundle\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.552985 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzcf\" (UniqueName: \"kubernetes.io/projected/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-kube-api-access-ztzcf\") pod \"keystone-bootstrap-s7fbv\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.555697 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-config-data\") pod \"heat-db-sync-kzxbf\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.580422 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-q9sfs"] Dec 02 
16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.591624 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-649gh"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.600352 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.601485 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pd5s5"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.602968 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.604594 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.604717 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xjnn2" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.609642 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-649gh"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.610121 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xpbfr" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.610271 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.611258 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.628292 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.636646 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.638323 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pd5s5"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.644394 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.644603 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.648730 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654264 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htbmg\" (UniqueName: \"kubernetes.io/projected/9c326307-73df-462e-98ec-0e4dc89fdd54-kube-api-access-htbmg\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654357 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c326307-73df-462e-98ec-0e4dc89fdd54-etc-machine-id\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654411 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-config\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654433 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-combined-ca-bundle\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654453 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-combined-ca-bundle\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654479 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-db-sync-config-data\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654518 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-scripts\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654551 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgfhp\" (UniqueName: 
\"kubernetes.io/projected/d342549d-487a-4b41-89cc-8ce263aed373-kube-api-access-tgfhp\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-config-data\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.654902 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c326307-73df-462e-98ec-0e4dc89fdd54-etc-machine-id\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.660862 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-wlntx"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.662770 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.666494 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-scripts\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.672051 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-wlntx"] Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.681905 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-combined-ca-bundle\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.683153 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htbmg\" (UniqueName: \"kubernetes.io/projected/9c326307-73df-462e-98ec-0e4dc89fdd54-kube-api-access-htbmg\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.684136 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-config\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.684452 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-config-data\") pod \"cinder-db-sync-gtbcc\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.685030 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-db-sync-config-data\") pod \"cinder-db-sync-gtbcc\" (UID: 
\"9c326307-73df-462e-98ec-0e4dc89fdd54\") " pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.685552 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-combined-ca-bundle\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.689915 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgfhp\" (UniqueName: \"kubernetes.io/projected/d342549d-487a-4b41-89cc-8ce263aed373-kube-api-access-tgfhp\") pod \"neutron-db-sync-sxblk\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.705181 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kzxbf" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.720294 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sxblk" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.737614 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761357 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7179e9e-f623-44a2-9f70-021244770e56-logs\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761414 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761454 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-config-data\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761485 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-log-httpd\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761509 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-run-httpd\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761555 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-db-sync-config-data\") pod \"barbican-db-sync-649gh\" 
(UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761588 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82p95\" (UniqueName: \"kubernetes.io/projected/f7179e9e-f623-44a2-9f70-021244770e56-kube-api-access-82p95\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761608 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-combined-ca-bundle\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761654 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-scripts\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761686 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-combined-ca-bundle\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761710 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-config-data\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761766 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqts\" (UniqueName: \"kubernetes.io/projected/ae883c36-aac5-49f4-839c-d0140fe724cc-kube-api-access-mjqts\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761798 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761845 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-scripts\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761905 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: 
\"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761940 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761960 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55s9v\" (UniqueName: \"kubernetes.io/projected/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-kube-api-access-55s9v\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.761994 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-config\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.762072 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv2zw\" (UniqueName: \"kubernetes.io/projected/4250e624-4c9d-426c-8856-76782f0bb0e3-kube-api-access-qv2zw\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.762106 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.793228 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864437 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv2zw\" (UniqueName: \"kubernetes.io/projected/4250e624-4c9d-426c-8856-76782f0bb0e3-kube-api-access-qv2zw\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864504 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864560 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7179e9e-f623-44a2-9f70-021244770e56-logs\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864580 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864621 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-config-data\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864646 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-log-httpd\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864664 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-run-httpd\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864744 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-db-sync-config-data\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864775 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82p95\" (UniqueName: \"kubernetes.io/projected/f7179e9e-f623-44a2-9f70-021244770e56-kube-api-access-82p95\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864845 4933 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-combined-ca-bundle\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864872 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-scripts\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864920 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-combined-ca-bundle\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.864938 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-config-data\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865015 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqts\" (UniqueName: \"kubernetes.io/projected/ae883c36-aac5-49f4-839c-d0140fe724cc-kube-api-access-mjqts\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865052 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865110 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-scripts\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865168 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865208 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865260 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55s9v\" (UniqueName: \"kubernetes.io/projected/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-kube-api-access-55s9v\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " 
pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865326 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-config\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865470 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-run-httpd\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865611 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.865864 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7179e9e-f623-44a2-9f70-021244770e56-logs\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.866627 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-log-httpd\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.866976 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-config\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.868035 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.868738 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.870551 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-db-sync-config-data\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.871340 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-config-data\") pod 
\"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.873077 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-scripts\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.877963 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-combined-ca-bundle\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.878193 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-config-data\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.878317 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-scripts\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.879216 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.886386 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.888072 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv2zw\" (UniqueName: \"kubernetes.io/projected/4250e624-4c9d-426c-8856-76782f0bb0e3-kube-api-access-qv2zw\") pod \"dnsmasq-dns-6ffb94d8ff-wlntx\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.890549 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-combined-ca-bundle\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.891987 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82p95\" (UniqueName: \"kubernetes.io/projected/f7179e9e-f623-44a2-9f70-021244770e56-kube-api-access-82p95\") pod \"placement-db-sync-pd5s5\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.896793 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqts\" (UniqueName: 
\"kubernetes.io/projected/ae883c36-aac5-49f4-839c-d0140fe724cc-kube-api-access-mjqts\") pod \"barbican-db-sync-649gh\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.901636 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55s9v\" (UniqueName: \"kubernetes.io/projected/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-kube-api-access-55s9v\") pod \"ceilometer-0\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " pod="openstack/ceilometer-0" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.927131 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-649gh" Dec 02 16:14:24 crc kubenswrapper[4933]: I1202 16:14:24.940645 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pd5s5" Dec 02 16:14:25 crc kubenswrapper[4933]: I1202 16:14:25.079772 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:14:25 crc kubenswrapper[4933]: I1202 16:14:25.098320 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:25 crc kubenswrapper[4933]: I1202 16:14:25.868882 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"c104a1178b092ed1f73af4b4b47ab5f73e004c96a1a61c6359a2de4787846150"} Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.048384 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kzxbf"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.435448 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gtbcc"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.450148 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-q9sfs"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.509962 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-wlntx"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.520760 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.534320 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7fbv"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.541931 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pd5s5"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.610215 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sxblk"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.627573 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-649gh"] Dec 02 16:14:26 crc kubenswrapper[4933]: I1202 16:14:26.678369 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:14:27 crc kubenswrapper[4933]: I1202 16:14:27.932690 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8621d49b-46de-42d1-a6aa-291b8cda18ca","Type":"ContainerStarted","Data":"09fa16c79369fa62164103f1bf7d9eeb8ca8fb4ddad250ef02704b6932a5a0f8"} Dec 02 16:14:33 crc kubenswrapper[4933]: W1202 16:14:33.533874 4933 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d02b9de_9f5e_44fc_9e29_6a01b47c6ad7.slice/crio-550c21306ee7cd482a5b23cad1a624c6582a6fa96e02efd4d938b89c9799676b WatchSource:0}: Error finding container 550c21306ee7cd482a5b23cad1a624c6582a6fa96e02efd4d938b89c9799676b: Status 404 returned error can't find the container with id 550c21306ee7cd482a5b23cad1a624c6582a6fa96e02efd4d938b89c9799676b Dec 02 16:14:33 crc kubenswrapper[4933]: W1202 16:14:33.536799 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae883c36_aac5_49f4_839c_d0140fe724cc.slice/crio-9e3a9a011e145842e25c5a2cfa03fcee26cf81e130cb8b1fe579741a22345042 WatchSource:0}: Error finding container 9e3a9a011e145842e25c5a2cfa03fcee26cf81e130cb8b1fe579741a22345042: Status 404 returned error can't find the container with id 9e3a9a011e145842e25c5a2cfa03fcee26cf81e130cb8b1fe579741a22345042 Dec 02 16:14:33 crc kubenswrapper[4933]: W1202 16:14:33.542874 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4250e624_4c9d_426c_8856_76782f0bb0e3.slice/crio-14dfc16206e6010bb396f674bfc4a8272168a9e5f3375d524bc48cb36b358838 WatchSource:0}: Error finding container 14dfc16206e6010bb396f674bfc4a8272168a9e5f3375d524bc48cb36b358838: Status 404 returned error can't find the container with id 14dfc16206e6010bb396f674bfc4a8272168a9e5f3375d524bc48cb36b358838 Dec 02 16:14:33 crc kubenswrapper[4933]: W1202 16:14:33.579974 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a52b684_ce41_4f94_a185_3b8c522cd365.slice/crio-d32edbcee0c5967a658a2121fdc16fb184828ba308e8df11456f433f02dec641 WatchSource:0}: Error finding container d32edbcee0c5967a658a2121fdc16fb184828ba308e8df11456f433f02dec641: Status 404 returned error can't find the container with id d32edbcee0c5967a658a2121fdc16fb184828ba308e8df11456f433f02dec641 Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.028604 4933 generic.go:334] "Generic (PLEG): container finished" podID="0a52b684-ce41-4f94-a185-3b8c522cd365" containerID="c384c70a6750249ce23d02517ad243f80afc875377fa2e42246e0a4a3e802bfd" exitCode=0 Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.028682 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" event={"ID":"0a52b684-ce41-4f94-a185-3b8c522cd365","Type":"ContainerDied","Data":"c384c70a6750249ce23d02517ad243f80afc875377fa2e42246e0a4a3e802bfd"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.029414 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" event={"ID":"0a52b684-ce41-4f94-a185-3b8c522cd365","Type":"ContainerStarted","Data":"d32edbcee0c5967a658a2121fdc16fb184828ba308e8df11456f433f02dec641"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.059241 4933 generic.go:334] "Generic (PLEG): container finished" podID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerID="feb12c25836a067ef3c2f30bef307a3f9cb9fd99f8512dff2638256a3f21cdc0" exitCode=0 Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.059346 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" event={"ID":"4250e624-4c9d-426c-8856-76782f0bb0e3","Type":"ContainerDied","Data":"feb12c25836a067ef3c2f30bef307a3f9cb9fd99f8512dff2638256a3f21cdc0"} Dec 02 16:14:34 crc 
kubenswrapper[4933]: I1202 16:14:34.059390 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" event={"ID":"4250e624-4c9d-426c-8856-76782f0bb0e3","Type":"ContainerStarted","Data":"14dfc16206e6010bb396f674bfc4a8272168a9e5f3375d524bc48cb36b358838"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.076941 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8621d49b-46de-42d1-a6aa-291b8cda18ca","Type":"ContainerStarted","Data":"59dcd653762b60db0151d7ae065cff413eb3f70ef6e3c063cd5fd5573d23e160"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.088683 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gtbcc" event={"ID":"9c326307-73df-462e-98ec-0e4dc89fdd54","Type":"ContainerStarted","Data":"313c2171fb9b1539094a70fde927f6b0bb7f25b38ec453ec852f2b26a4f221d9"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.098018 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sxblk" event={"ID":"d342549d-487a-4b41-89cc-8ce263aed373","Type":"ContainerStarted","Data":"4e73c93e39976283c0cf550c068a7692530e082ccc049f2f51661c2a9b7ee222"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.098060 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sxblk" event={"ID":"d342549d-487a-4b41-89cc-8ce263aed373","Type":"ContainerStarted","Data":"5dbfb0ad8df619c1a8c6cf3472f4f2b5e126cb9821bc8370ae86a9b7a5b04d9b"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.110363 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerStarted","Data":"550c21306ee7cd482a5b23cad1a624c6582a6fa96e02efd4d938b89c9799676b"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.129913 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pd5s5" event={"ID":"f7179e9e-f623-44a2-9f70-021244770e56","Type":"ContainerStarted","Data":"fb9b424e4304e3a730d3035c6163c7a76b941ff1dc414952e10ffd3f54504c87"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.133531 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.133505656 podStartE2EDuration="24.133505656s" podCreationTimestamp="2025-12-02 16:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:34.116790802 +0000 UTC m=+1337.368017525" watchObservedRunningTime="2025-12-02 16:14:34.133505656 +0000 UTC m=+1337.384732359" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.136875 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kzxbf" event={"ID":"c4239535-0d5c-4b17-a695-9f57efb4d381","Type":"ContainerStarted","Data":"fa6ffc6935851f92d2d41ea2dfa32f36e7c6c6d7dda367e75428d91c958803c0"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.138872 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7fbv" event={"ID":"ce6f7cae-26b1-41d6-9cf0-769405edbdb5","Type":"ContainerStarted","Data":"feb9062af93b73044a3514582bb2b43e14a1b8d6f2d488fd8ebfa813c946cfae"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.138914 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7fbv" 
event={"ID":"ce6f7cae-26b1-41d6-9cf0-769405edbdb5","Type":"ContainerStarted","Data":"11fa1889b412c692af9477a3e6bfd620ce6c81114268350621be45628f8c02d5"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.140094 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-649gh" event={"ID":"ae883c36-aac5-49f4-839c-d0140fe724cc","Type":"ContainerStarted","Data":"9e3a9a011e145842e25c5a2cfa03fcee26cf81e130cb8b1fe579741a22345042"} Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.152411 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sxblk" podStartSLOduration=10.152387878 podStartE2EDuration="10.152387878s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:34.132324755 +0000 UTC m=+1337.383551478" watchObservedRunningTime="2025-12-02 16:14:34.152387878 +0000 UTC m=+1337.403614581" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.165035 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7fbv" podStartSLOduration=10.165012983 podStartE2EDuration="10.165012983s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:34.163291237 +0000 UTC m=+1337.414517960" watchObservedRunningTime="2025-12-02 16:14:34.165012983 +0000 UTC m=+1337.416239676" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.464322 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.532593 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-sb\") pod \"0a52b684-ce41-4f94-a185-3b8c522cd365\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.532658 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-dns-svc\") pod \"0a52b684-ce41-4f94-a185-3b8c522cd365\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.532707 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-config\") pod \"0a52b684-ce41-4f94-a185-3b8c522cd365\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.532786 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x949c\" (UniqueName: \"kubernetes.io/projected/0a52b684-ce41-4f94-a185-3b8c522cd365-kube-api-access-x949c\") pod \"0a52b684-ce41-4f94-a185-3b8c522cd365\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.532921 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-nb\") pod \"0a52b684-ce41-4f94-a185-3b8c522cd365\" (UID: \"0a52b684-ce41-4f94-a185-3b8c522cd365\") " Dec 02 16:14:34 
crc kubenswrapper[4933]: I1202 16:14:34.575114 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a52b684-ce41-4f94-a185-3b8c522cd365" (UID: "0a52b684-ce41-4f94-a185-3b8c522cd365"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.577089 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a52b684-ce41-4f94-a185-3b8c522cd365-kube-api-access-x949c" (OuterVolumeSpecName: "kube-api-access-x949c") pod "0a52b684-ce41-4f94-a185-3b8c522cd365" (UID: "0a52b684-ce41-4f94-a185-3b8c522cd365"). InnerVolumeSpecName "kube-api-access-x949c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.585382 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a52b684-ce41-4f94-a185-3b8c522cd365" (UID: "0a52b684-ce41-4f94-a185-3b8c522cd365"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.601988 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-config" (OuterVolumeSpecName: "config") pod "0a52b684-ce41-4f94-a185-3b8c522cd365" (UID: "0a52b684-ce41-4f94-a185-3b8c522cd365"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.611418 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a52b684-ce41-4f94-a185-3b8c522cd365" (UID: "0a52b684-ce41-4f94-a185-3b8c522cd365"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.651019 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.651066 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.651088 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.651108 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a52b684-ce41-4f94-a185-3b8c522cd365-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:34 crc kubenswrapper[4933]: I1202 16:14:34.651120 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x949c\" (UniqueName: \"kubernetes.io/projected/0a52b684-ce41-4f94-a185-3b8c522cd365-kube-api-access-x949c\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.157182 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x8fgp" event={"ID":"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4","Type":"ContainerStarted","Data":"c37bb5d562f32a158fdbc43377a6a660cf52d09e6b14589c72e72f0ef54b3878"} Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.165371 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.166230 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-q9sfs" event={"ID":"0a52b684-ce41-4f94-a185-3b8c522cd365","Type":"ContainerDied","Data":"d32edbcee0c5967a658a2121fdc16fb184828ba308e8df11456f433f02dec641"} Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.166284 4933 scope.go:117] "RemoveContainer" containerID="c384c70a6750249ce23d02517ad243f80afc875377fa2e42246e0a4a3e802bfd" Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.183249 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"e144728a2635b1c8fe2e3b5c23c70940ab65a51a363d7b62e9176af5bcc975c9"} Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.188916 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-x8fgp" podStartSLOduration=2.9006810979999997 podStartE2EDuration="19.188899448s" podCreationTimestamp="2025-12-02 16:14:16 +0000 UTC" firstStartedPulling="2025-12-02 16:14:17.483481382 +0000 UTC m=+1320.734708085" lastFinishedPulling="2025-12-02 16:14:33.771699732 +0000 UTC m=+1337.022926435" observedRunningTime="2025-12-02 16:14:35.174381153 +0000 UTC m=+1338.425607866" watchObservedRunningTime="2025-12-02 16:14:35.188899448 +0000 UTC m=+1338.440126151" Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.224595 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" event={"ID":"4250e624-4c9d-426c-8856-76782f0bb0e3","Type":"ContainerStarted","Data":"4efceaedaa6cc040fda75de4cbd3c0d92c3c599f1dd7f6fe5f60ab7da99f525d"} Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.225596 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.248202 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-q9sfs"] Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.258706 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-q9sfs"] Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.268599 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" podStartSLOduration=11.268580686 podStartE2EDuration="11.268580686s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:14:35.245757909 +0000 UTC m=+1338.496984612" watchObservedRunningTime="2025-12-02 16:14:35.268580686 +0000 UTC m=+1338.519807389" Dec 02 16:14:35 crc kubenswrapper[4933]: I1202 16:14:35.998430 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:36 crc kubenswrapper[4933]: I1202 16:14:36.244526 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"5713337b63345ae7b47e61ae6763fffaaa1f1aa735db5185483c2a9b14f5f172"} Dec 02 16:14:37 crc kubenswrapper[4933]: I1202 16:14:37.064694 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a52b684-ce41-4f94-a185-3b8c522cd365" 
path="/var/lib/kubelet/pods/0a52b684-ce41-4f94-a185-3b8c522cd365/volumes" Dec 02 16:14:37 crc kubenswrapper[4933]: I1202 16:14:37.271892 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"3e5a5c7b582d7f7507d2457d3812d558c773a374d755b395d9fd627340d111bc"} Dec 02 16:14:39 crc kubenswrapper[4933]: I1202 16:14:39.294535 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"bee3f75cd5b2447ad50c4bb8bb8bff94322bc2363fbba95a4fef95d2023053ba"} Dec 02 16:14:39 crc kubenswrapper[4933]: I1202 16:14:39.297389 4933 generic.go:334] "Generic (PLEG): container finished" podID="ce6f7cae-26b1-41d6-9cf0-769405edbdb5" containerID="feb9062af93b73044a3514582bb2b43e14a1b8d6f2d488fd8ebfa813c946cfae" exitCode=0 Dec 02 16:14:39 crc kubenswrapper[4933]: I1202 16:14:39.297446 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7fbv" event={"ID":"ce6f7cae-26b1-41d6-9cf0-769405edbdb5","Type":"ContainerDied","Data":"feb9062af93b73044a3514582bb2b43e14a1b8d6f2d488fd8ebfa813c946cfae"} Dec 02 16:14:40 crc kubenswrapper[4933]: I1202 16:14:40.099962 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:14:40 crc kubenswrapper[4933]: I1202 16:14:40.177340 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5fxhc"] Dec 02 16:14:40 crc kubenswrapper[4933]: I1202 16:14:40.177631 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" containerID="cri-o://f815f4b942dceacd673181b01a8d49d8c21f7e4c6a9171fece2cfebed934c41f" gracePeriod=10 Dec 02 16:14:40 crc kubenswrapper[4933]: I1202 16:14:40.998570 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:41 crc kubenswrapper[4933]: I1202 16:14:41.015277 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:41 crc kubenswrapper[4933]: I1202 16:14:41.345638 4933 generic.go:334] "Generic (PLEG): container finished" podID="36ad0698-417e-43da-b2da-8d4752f57abf" containerID="f815f4b942dceacd673181b01a8d49d8c21f7e4c6a9171fece2cfebed934c41f" exitCode=0 Dec 02 16:14:41 crc kubenswrapper[4933]: I1202 16:14:41.345941 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" event={"ID":"36ad0698-417e-43da-b2da-8d4752f57abf","Type":"ContainerDied","Data":"f815f4b942dceacd673181b01a8d49d8c21f7e4c6a9171fece2cfebed934c41f"} Dec 02 16:14:41 crc kubenswrapper[4933]: I1202 16:14:41.355306 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 16:14:43 crc kubenswrapper[4933]: I1202 16:14:43.383896 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 02 16:14:47 crc kubenswrapper[4933]: I1202 16:14:47.169339 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:14:47 crc kubenswrapper[4933]: I1202 16:14:47.169909 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:14:48 crc kubenswrapper[4933]: I1202 16:14:48.383803 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 02 16:14:49 crc kubenswrapper[4933]: I1202 16:14:49.431166 4933 generic.go:334] "Generic (PLEG): container finished" podID="5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" containerID="c37bb5d562f32a158fdbc43377a6a660cf52d09e6b14589c72e72f0ef54b3878" exitCode=0 Dec 02 16:14:49 crc kubenswrapper[4933]: I1202 16:14:49.431257 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x8fgp" event={"ID":"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4","Type":"ContainerDied","Data":"c37bb5d562f32a158fdbc43377a6a660cf52d09e6b14589c72e72f0ef54b3878"} Dec 02 16:14:53 crc kubenswrapper[4933]: I1202 16:14:53.384333 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 02 16:14:53 crc kubenswrapper[4933]: I1202 16:14:53.385130 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.198191 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.321716 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-scripts\") pod \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.321974 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-fernet-keys\") pod \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.322018 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-combined-ca-bundle\") pod \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.322117 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-config-data\") pod \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.322167 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzcf\" (UniqueName: \"kubernetes.io/projected/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-kube-api-access-ztzcf\") pod \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.322281 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-credential-keys\") pod \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\" (UID: \"ce6f7cae-26b1-41d6-9cf0-769405edbdb5\") " Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.327567 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ce6f7cae-26b1-41d6-9cf0-769405edbdb5" (UID: "ce6f7cae-26b1-41d6-9cf0-769405edbdb5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.343528 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ce6f7cae-26b1-41d6-9cf0-769405edbdb5" (UID: "ce6f7cae-26b1-41d6-9cf0-769405edbdb5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.345848 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-kube-api-access-ztzcf" (OuterVolumeSpecName: "kube-api-access-ztzcf") pod "ce6f7cae-26b1-41d6-9cf0-769405edbdb5" (UID: "ce6f7cae-26b1-41d6-9cf0-769405edbdb5"). InnerVolumeSpecName "kube-api-access-ztzcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.346977 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-scripts" (OuterVolumeSpecName: "scripts") pod "ce6f7cae-26b1-41d6-9cf0-769405edbdb5" (UID: "ce6f7cae-26b1-41d6-9cf0-769405edbdb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.358668 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-config-data" (OuterVolumeSpecName: "config-data") pod "ce6f7cae-26b1-41d6-9cf0-769405edbdb5" (UID: "ce6f7cae-26b1-41d6-9cf0-769405edbdb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.374073 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce6f7cae-26b1-41d6-9cf0-769405edbdb5" (UID: "ce6f7cae-26b1-41d6-9cf0-769405edbdb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.427486 4933 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.427524 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.427535 4933 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.427546 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.427556 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.427568 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzcf\" (UniqueName: \"kubernetes.io/projected/ce6f7cae-26b1-41d6-9cf0-769405edbdb5-kube-api-access-ztzcf\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.490570 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7fbv" event={"ID":"ce6f7cae-26b1-41d6-9cf0-769405edbdb5","Type":"ContainerDied","Data":"11fa1889b412c692af9477a3e6bfd620ce6c81114268350621be45628f8c02d5"} Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.490609 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11fa1889b412c692af9477a3e6bfd620ce6c81114268350621be45628f8c02d5" Dec 02 16:14:54 crc kubenswrapper[4933]: I1202 16:14:54.490616 4933 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7fbv" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.091623 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.092636 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjqts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-649gh_openstack(ae883c36-aac5-49f4-839c-d0140fe724cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.093852 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-649gh" podUID="ae883c36-aac5-49f4-839c-d0140fe724cc" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.146292 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.243142 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-nb\") pod \"36ad0698-417e-43da-b2da-8d4752f57abf\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.243236 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vtfg\" (UniqueName: \"kubernetes.io/projected/36ad0698-417e-43da-b2da-8d4752f57abf-kube-api-access-5vtfg\") pod \"36ad0698-417e-43da-b2da-8d4752f57abf\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.243300 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-config\") pod \"36ad0698-417e-43da-b2da-8d4752f57abf\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.243326 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-sb\") pod \"36ad0698-417e-43da-b2da-8d4752f57abf\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.243464 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-dns-svc\") pod \"36ad0698-417e-43da-b2da-8d4752f57abf\" (UID: \"36ad0698-417e-43da-b2da-8d4752f57abf\") " Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.248226 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ad0698-417e-43da-b2da-8d4752f57abf-kube-api-access-5vtfg" (OuterVolumeSpecName: "kube-api-access-5vtfg") pod "36ad0698-417e-43da-b2da-8d4752f57abf" (UID: "36ad0698-417e-43da-b2da-8d4752f57abf"). InnerVolumeSpecName "kube-api-access-5vtfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.308624 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36ad0698-417e-43da-b2da-8d4752f57abf" (UID: "36ad0698-417e-43da-b2da-8d4752f57abf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.312462 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36ad0698-417e-43da-b2da-8d4752f57abf" (UID: "36ad0698-417e-43da-b2da-8d4752f57abf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.315185 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7fbv"] Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.319935 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-config" (OuterVolumeSpecName: "config") pod "36ad0698-417e-43da-b2da-8d4752f57abf" (UID: "36ad0698-417e-43da-b2da-8d4752f57abf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.325814 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7fbv"] Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.330244 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36ad0698-417e-43da-b2da-8d4752f57abf" (UID: "36ad0698-417e-43da-b2da-8d4752f57abf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.347517 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.348426 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.348634 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vtfg\" (UniqueName: \"kubernetes.io/projected/36ad0698-417e-43da-b2da-8d4752f57abf-kube-api-access-5vtfg\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.348749 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.348838 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ad0698-417e-43da-b2da-8d4752f57abf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.405702 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rwgj6"] Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.406328 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a52b684-ce41-4f94-a185-3b8c522cd365" containerName="init" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406349 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a52b684-ce41-4f94-a185-3b8c522cd365" containerName="init" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.406358 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="init" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406365 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="init" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.406379 4933 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ce6f7cae-26b1-41d6-9cf0-769405edbdb5" containerName="keystone-bootstrap" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406387 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6f7cae-26b1-41d6-9cf0-769405edbdb5" containerName="keystone-bootstrap" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.406419 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406426 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406675 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a52b684-ce41-4f94-a185-3b8c522cd365" containerName="init" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406700 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6f7cae-26b1-41d6-9cf0-769405edbdb5" containerName="keystone-bootstrap" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.406767 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" containerName="dnsmasq-dns" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.408160 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.413291 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.413668 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntjs" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.414082 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.414594 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.414631 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.433484 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rwgj6"] Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.501522 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.501552 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5fxhc" event={"ID":"36ad0698-417e-43da-b2da-8d4752f57abf","Type":"ContainerDied","Data":"cfc8f745c2bdcefeb4f0a535bba7977c352b6f9375c3376a0b90a0e9da59fc70"} Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.501597 4933 scope.go:117] "RemoveContainer" containerID="f815f4b942dceacd673181b01a8d49d8c21f7e4c6a9171fece2cfebed934c41f" Dec 02 16:14:55 crc kubenswrapper[4933]: E1202 16:14:55.506076 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-649gh" podUID="ae883c36-aac5-49f4-839c-d0140fe724cc" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.550919 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5fxhc"] Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.552809 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-credential-keys\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.552995 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-fernet-keys\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.553097 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-scripts\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.553196 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-combined-ca-bundle\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.553336 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-config-data\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.553472 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xkn\" (UniqueName: \"kubernetes.io/projected/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-kube-api-access-h6xkn\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 
16:14:55.565986 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5fxhc"] Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.655130 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-credential-keys\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.655493 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-fernet-keys\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.655570 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-scripts\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.655606 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-combined-ca-bundle\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.655672 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-config-data\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.655761 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xkn\" (UniqueName: \"kubernetes.io/projected/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-kube-api-access-h6xkn\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.660214 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-credential-keys\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.661801 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-scripts\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.666993 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-fernet-keys\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.669390 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-combined-ca-bundle\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.676320 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xkn\" (UniqueName: \"kubernetes.io/projected/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-kube-api-access-h6xkn\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.682135 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-config-data\") pod \"keystone-bootstrap-rwgj6\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:55 crc kubenswrapper[4933]: I1202 16:14:55.757335 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:14:57 crc kubenswrapper[4933]: I1202 16:14:57.071128 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ad0698-417e-43da-b2da-8d4752f57abf" path="/var/lib/kubelet/pods/36ad0698-417e-43da-b2da-8d4752f57abf/volumes" Dec 02 16:14:57 crc kubenswrapper[4933]: I1202 16:14:57.072310 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6f7cae-26b1-41d6-9cf0-769405edbdb5" path="/var/lib/kubelet/pods/ce6f7cae-26b1-41d6-9cf0-769405edbdb5/volumes" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.141019 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv"] Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.143001 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.145525 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.153246 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.156200 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv"] Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.190078 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ddf\" (UniqueName: \"kubernetes.io/projected/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-kube-api-access-w7ddf\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.190192 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-secret-volume\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.190358 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-config-volume\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.291678 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-config-volume\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.291967 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ddf\" (UniqueName: \"kubernetes.io/projected/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-kube-api-access-w7ddf\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.292016 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-secret-volume\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.292664 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-config-volume\") pod 
\"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.304298 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-secret-volume\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.308794 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ddf\" (UniqueName: \"kubernetes.io/projected/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-kube-api-access-w7ddf\") pod \"collect-profiles-29411535-sfhvv\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:00 crc kubenswrapper[4933]: I1202 16:15:00.494219 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:04 crc kubenswrapper[4933]: E1202 16:15:04.458347 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 02 16:15:04 crc kubenswrapper[4933]: E1202 16:15:04.458773 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqxrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-kzxbf_openstack(c4239535-0d5c-4b17-a695-9f57efb4d381): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:15:04 crc kubenswrapper[4933]: E1202 16:15:04.460109 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-kzxbf" podUID="c4239535-0d5c-4b17-a695-9f57efb4d381" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.541703 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-x8fgp" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.631533 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-x8fgp" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.631578 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x8fgp" event={"ID":"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4","Type":"ContainerDied","Data":"64cc692081ef026e3a5ad414a271e843a471f5f0994d2b7549c4b30da61264a0"} Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.631642 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64cc692081ef026e3a5ad414a271e843a471f5f0994d2b7549c4b30da61264a0" Dec 02 16:15:04 crc kubenswrapper[4933]: E1202 16:15:04.632080 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-kzxbf" podUID="c4239535-0d5c-4b17-a695-9f57efb4d381" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.691511 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-combined-ca-bundle\") pod \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.691589 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-config-data\") pod \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.691638 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-db-sync-config-data\") pod \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.692181 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djvls\" (UniqueName: 
\"kubernetes.io/projected/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-kube-api-access-djvls\") pod \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\" (UID: \"5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4\") " Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.702976 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" (UID: "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.703092 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-kube-api-access-djvls" (OuterVolumeSpecName: "kube-api-access-djvls") pod "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" (UID: "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4"). InnerVolumeSpecName "kube-api-access-djvls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.724331 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" (UID: "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.750979 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-config-data" (OuterVolumeSpecName: "config-data") pod "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" (UID: "5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.793953 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.793985 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.793994 4933 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:04 crc kubenswrapper[4933]: I1202 16:15:04.794005 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djvls\" (UniqueName: \"kubernetes.io/projected/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4-kube-api-access-djvls\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:04 crc kubenswrapper[4933]: E1202 16:15:04.851580 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 02 16:15:04 crc kubenswrapper[4933]: E1202 16:15:04.851736 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58ch598h59ch9dh58fh5ffhb9h545h5f8h88h677h684h5b7h545h5c7h594h6dh67bh56bh566h549hd8h67fh68bhffh649h5cdh5d6h549h55ch97h54dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-55s9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 16:15:05.641580 4933 generic.go:334] "Generic (PLEG): container finished" podID="d342549d-487a-4b41-89cc-8ce263aed373" containerID="4e73c93e39976283c0cf550c068a7692530e082ccc049f2f51661c2a9b7ee222" exitCode=0 Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 16:15:05.641641 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sxblk" event={"ID":"d342549d-487a-4b41-89cc-8ce263aed373","Type":"ContainerDied","Data":"4e73c93e39976283c0cf550c068a7692530e082ccc049f2f51661c2a9b7ee222"} Dec 02 16:15:05 crc kubenswrapper[4933]: E1202 16:15:05.936132 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 16:15:05 crc kubenswrapper[4933]: E1202 16:15:05.936323 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-htbmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gtbcc_openstack(9c326307-73df-462e-98ec-0e4dc89fdd54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 16:15:05.939272 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56798b757f-hcdxn"] Dec 02 16:15:05 crc kubenswrapper[4933]: E1202 16:15:05.939348 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gtbcc" podUID="9c326307-73df-462e-98ec-0e4dc89fdd54" Dec 02 16:15:05 crc kubenswrapper[4933]: E1202 16:15:05.939889 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" containerName="glance-db-sync" Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 16:15:05.939916 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" containerName="glance-db-sync" Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 16:15:05.940191 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" containerName="glance-db-sync" Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 
16:15:05.942578 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:05 crc kubenswrapper[4933]: I1202 16:15:05.965550 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-hcdxn"] Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.030091 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.030143 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.030336 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-config\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.030380 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88th\" (UniqueName: \"kubernetes.io/projected/753f73e1-f883-4c44-a499-3e73eadba372-kube-api-access-b88th\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.030478 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-dns-svc\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.044219 4933 scope.go:117] "RemoveContainer" containerID="5b185b4ccbd021ccf09063083956147391757c9423cb7ebad08d581d2737a815" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.134876 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-config\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.135514 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88th\" (UniqueName: \"kubernetes.io/projected/753f73e1-f883-4c44-a499-3e73eadba372-kube-api-access-b88th\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.135566 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-dns-svc\") pod 
\"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.135719 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.135742 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.137945 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.138033 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-dns-svc\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.138754 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-config\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.139252 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.169684 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88th\" (UniqueName: \"kubernetes.io/projected/753f73e1-f883-4c44-a499-3e73eadba372-kube-api-access-b88th\") pod \"dnsmasq-dns-56798b757f-hcdxn\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.230408 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.669404 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"83f22478c73db100fb380ba62ef7abb268b77324c5488cefc10f09b4a544fc0b"} Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.669896 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"c29c839e8c82aed5694b5f83f1860f26e2e36ff3fd32a87d012856aede5127db"} Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.672671 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pd5s5" event={"ID":"f7179e9e-f623-44a2-9f70-021244770e56","Type":"ContainerStarted","Data":"21d83126c87174185d7f389dac8b1554c1fbb2e8b04e811961f27bd1a5cff784"} Dec 02 16:15:06 crc kubenswrapper[4933]: E1202 16:15:06.695695 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-gtbcc" podUID="9c326307-73df-462e-98ec-0e4dc89fdd54" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.702567 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pd5s5" podStartSLOduration=10.389798179 podStartE2EDuration="42.702548575s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="2025-12-02 16:14:33.580026949 +0000 UTC m=+1336.831253672" lastFinishedPulling="2025-12-02 16:15:05.892777365 +0000 UTC m=+1369.144004068" observedRunningTime="2025-12-02 16:15:06.693283513 +0000 UTC m=+1369.944510236" watchObservedRunningTime="2025-12-02 16:15:06.702548575 +0000 UTC m=+1369.953775278" Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.722880 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv"] Dec 02 16:15:06 crc kubenswrapper[4933]: W1202 16:15:06.733100 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249f9151_f57b_4ab8_8e3a_5c5256e2ed1c.slice/crio-9b4158a9c618c852b5df6841332ed2eb564b7f95b723fa4755ac721afb6a6e51 WatchSource:0}: Error finding container 9b4158a9c618c852b5df6841332ed2eb564b7f95b723fa4755ac721afb6a6e51: Status 404 returned error can't find the container with id 9b4158a9c618c852b5df6841332ed2eb564b7f95b723fa4755ac721afb6a6e51 Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.740368 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rwgj6"] Dec 02 16:15:06 crc kubenswrapper[4933]: I1202 16:15:06.926272 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-hcdxn"] Dec 02 16:15:06 crc kubenswrapper[4933]: W1202 16:15:06.952643 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod753f73e1_f883_4c44_a499_3e73eadba372.slice/crio-b2f1e17caef99f451ddbeb323efa37dc3d65316eb6a291df036a51b800efca4e WatchSource:0}: Error finding container b2f1e17caef99f451ddbeb323efa37dc3d65316eb6a291df036a51b800efca4e: Status 404 returned error can't find the container with id 
b2f1e17caef99f451ddbeb323efa37dc3d65316eb6a291df036a51b800efca4e Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.039172 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.064776 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.070419 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fkqb" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.070988 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.071208 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.101520 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176142 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176249 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-logs\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176285 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176318 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6rlj\" (UniqueName: \"kubernetes.io/projected/4a957352-db06-4697-9911-a5e2a15e6ef0-kube-api-access-n6rlj\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176386 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176412 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.176432 4933 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.228199 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.230501 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.232533 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.262015 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285241 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285528 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285677 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-logs\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285730 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285766 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6rlj\" (UniqueName: \"kubernetes.io/projected/4a957352-db06-4697-9911-a5e2a15e6ef0-kube-api-access-n6rlj\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285923 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.285961 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.288454 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.291404 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.293011 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-logs\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.301091 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.302423 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.319459 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.327725 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6rlj\" (UniqueName: \"kubernetes.io/projected/4a957352-db06-4697-9911-a5e2a15e6ef0-kube-api-access-n6rlj\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.334146 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sxblk" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.382506 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.392780 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldl6x\" (UniqueName: \"kubernetes.io/projected/ef22a91b-23f4-4686-9672-57a7ce6e12d0-kube-api-access-ldl6x\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.392971 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.393083 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.393139 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.393215 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.393255 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.393283 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.494553 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-combined-ca-bundle\") pod \"d342549d-487a-4b41-89cc-8ce263aed373\" (UID: 
\"d342549d-487a-4b41-89cc-8ce263aed373\") " Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.494599 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-config\") pod \"d342549d-487a-4b41-89cc-8ce263aed373\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.494871 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgfhp\" (UniqueName: \"kubernetes.io/projected/d342549d-487a-4b41-89cc-8ce263aed373-kube-api-access-tgfhp\") pod \"d342549d-487a-4b41-89cc-8ce263aed373\" (UID: \"d342549d-487a-4b41-89cc-8ce263aed373\") " Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495177 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldl6x\" (UniqueName: \"kubernetes.io/projected/ef22a91b-23f4-4686-9672-57a7ce6e12d0-kube-api-access-ldl6x\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495286 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495363 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495401 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495434 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495462 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495481 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.495640 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.498256 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.500390 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d342549d-487a-4b41-89cc-8ce263aed373-kube-api-access-tgfhp" (OuterVolumeSpecName: "kube-api-access-tgfhp") pod "d342549d-487a-4b41-89cc-8ce263aed373" (UID: "d342549d-487a-4b41-89cc-8ce263aed373"). InnerVolumeSpecName "kube-api-access-tgfhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.500607 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.511862 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.516596 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.517181 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.534907 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d342549d-487a-4b41-89cc-8ce263aed373" (UID: "d342549d-487a-4b41-89cc-8ce263aed373"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.541394 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldl6x\" (UniqueName: \"kubernetes.io/projected/ef22a91b-23f4-4686-9672-57a7ce6e12d0-kube-api-access-ldl6x\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.549594 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-config" (OuterVolumeSpecName: "config") pod "d342549d-487a-4b41-89cc-8ce263aed373" (UID: "d342549d-487a-4b41-89cc-8ce263aed373"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.561143 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.597985 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgfhp\" (UniqueName: \"kubernetes.io/projected/d342549d-487a-4b41-89cc-8ce263aed373-kube-api-access-tgfhp\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.598025 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.598034 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d342549d-487a-4b41-89cc-8ce263aed373-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.628604 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.647479 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.711773 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cf325ab8-af91-4009-9e8b-a299db2234da","Type":"ContainerStarted","Data":"6f3295c51d6c4f8244ee72fea03ba95d8aab1cb154286bb50d9d068ef0ba7434"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.715346 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwgj6" event={"ID":"39ec29fe-4f0d-4afe-9440-0cdc7b30d651","Type":"ContainerStarted","Data":"aa43f89a5ee3cc09277b8d1c5371526e918b571f89705e74235a7dbb6f5a2482"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.715385 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwgj6" event={"ID":"39ec29fe-4f0d-4afe-9440-0cdc7b30d651","Type":"ContainerStarted","Data":"9971143dd34708cdc50c8e12f324a4488f2e2a4ffecf3da1386dff627a0e5223"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.727056 4933 generic.go:334] "Generic (PLEG): container finished" podID="249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" containerID="e3e507e10cf4ad3f6184e645ef67d092ac7ef28eb186805dbe88dbf2a8d0c8e6" exitCode=0 Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.727167 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" event={"ID":"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c","Type":"ContainerDied","Data":"e3e507e10cf4ad3f6184e645ef67d092ac7ef28eb186805dbe88dbf2a8d0c8e6"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.727201 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" event={"ID":"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c","Type":"ContainerStarted","Data":"9b4158a9c618c852b5df6841332ed2eb564b7f95b723fa4755ac721afb6a6e51"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.736294 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sxblk" event={"ID":"d342549d-487a-4b41-89cc-8ce263aed373","Type":"ContainerDied","Data":"5dbfb0ad8df619c1a8c6cf3472f4f2b5e126cb9821bc8370ae86a9b7a5b04d9b"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.736317 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sxblk" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.736333 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dbfb0ad8df619c1a8c6cf3472f4f2b5e126cb9821bc8370ae86a9b7a5b04d9b" Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.739404 4933 generic.go:334] "Generic (PLEG): container finished" podID="753f73e1-f883-4c44-a499-3e73eadba372" containerID="c1d02d84063ca3a37879551577a39230871433a740de7d8e61bfff843d900172" exitCode=0 Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.740702 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" event={"ID":"753f73e1-f883-4c44-a499-3e73eadba372","Type":"ContainerDied","Data":"c1d02d84063ca3a37879551577a39230871433a740de7d8e61bfff843d900172"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.740733 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" event={"ID":"753f73e1-f883-4c44-a499-3e73eadba372","Type":"ContainerStarted","Data":"b2f1e17caef99f451ddbeb323efa37dc3d65316eb6a291df036a51b800efca4e"} Dec 02 16:15:07 crc kubenswrapper[4933]: I1202 16:15:07.774003 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=67.342583821 podStartE2EDuration="1m24.773977967s" podCreationTimestamp="2025-12-02 16:13:43 +0000 UTC" firstStartedPulling="2025-12-02 16:14:17.064762585 +0000 UTC m=+1320.315989288" lastFinishedPulling="2025-12-02 16:14:34.496156731 +0000 UTC m=+1337.747383434" observedRunningTime="2025-12-02 16:15:07.766218857 +0000 UTC m=+1371.017445580" watchObservedRunningTime="2025-12-02 16:15:07.773977967 +0000 UTC m=+1371.025204670" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.014894 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-hcdxn"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.033841 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rwgj6" podStartSLOduration=13.03380282 podStartE2EDuration="13.03380282s" podCreationTimestamp="2025-12-02 16:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:07.925936328 +0000 UTC m=+1371.177163031" watchObservedRunningTime="2025-12-02 16:15:08.03380282 +0000 UTC m=+1371.285029533" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.063813 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-gg98x"] Dec 02 16:15:08 crc kubenswrapper[4933]: E1202 16:15:08.064386 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d342549d-487a-4b41-89cc-8ce263aed373" containerName="neutron-db-sync" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.064406 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d342549d-487a-4b41-89cc-8ce263aed373" containerName="neutron-db-sync" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.064667 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d342549d-487a-4b41-89cc-8ce263aed373" containerName="neutron-db-sync" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.065910 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.082317 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-858c7d46f6-k58rb"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.084545 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.087266 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k4lbn" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.087376 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.089537 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.089670 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.111846 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-gg98x"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.131604 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xdvc\" (UniqueName: \"kubernetes.io/projected/43ba7114-d190-4ddc-93e1-d00663935f84-kube-api-access-7xdvc\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.131643 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-sb\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.131736 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-config\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.131757 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-nb\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.132028 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-dns-svc\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.165342 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-858c7d46f6-k58rb"] Dec 02 16:15:08 crc kubenswrapper[4933]: E1202 16:15:08.217538 4933 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 02 16:15:08 
crc kubenswrapper[4933]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/753f73e1-f883-4c44-a499-3e73eadba372/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 16:15:08 crc kubenswrapper[4933]: > podSandboxID="b2f1e17caef99f451ddbeb323efa37dc3d65316eb6a291df036a51b800efca4e" Dec 02 16:15:08 crc kubenswrapper[4933]: E1202 16:15:08.217690 4933 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 02 16:15:08 crc kubenswrapper[4933]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7h6dh64ch5dhdh6dh68dh565hdfh55bh68ch698h548hbdh5cchd9h67fh5fh569h55bh67bh9dh555h678h669hddh76h659h599hbdhfbhc5q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b88th,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-56798b757f-hcdxn_openstack(753f73e1-f883-4c44-a499-3e73eadba372): 
CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/753f73e1-f883-4c44-a499-3e73eadba372/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 16:15:08 crc kubenswrapper[4933]: > logger="UnhandledError" Dec 02 16:15:08 crc kubenswrapper[4933]: E1202 16:15:08.219855 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/753f73e1-f883-4c44-a499-3e73eadba372/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" podUID="753f73e1-f883-4c44-a499-3e73eadba372" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.237090 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwb5s\" (UniqueName: \"kubernetes.io/projected/d981749f-83d5-4096-869d-617b2a6f1ce7-kube-api-access-wwb5s\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.237208 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-dns-svc\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.238648 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-dns-svc\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.240966 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-httpd-config\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.241092 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xdvc\" (UniqueName: \"kubernetes.io/projected/43ba7114-d190-4ddc-93e1-d00663935f84-kube-api-access-7xdvc\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.241128 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-sb\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.241183 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-combined-ca-bundle\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 
16:15:08.241308 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-config\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.241339 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-nb\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.241372 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-ovndb-tls-certs\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.241443 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-config\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.242241 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-sb\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.242859 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-nb\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.245120 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-config\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.267815 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-gg98x"] Dec 02 16:15:08 crc kubenswrapper[4933]: E1202 16:15:08.269354 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7xdvc], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-b6c948c7-gg98x" podUID="43ba7114-d190-4ddc-93e1-d00663935f84" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.283751 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-rsdtj"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.285591 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.296496 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-rsdtj"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.300178 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.301340 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xdvc\" (UniqueName: \"kubernetes.io/projected/43ba7114-d190-4ddc-93e1-d00663935f84-kube-api-access-7xdvc\") pod \"dnsmasq-dns-b6c948c7-gg98x\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.343830 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwb5s\" (UniqueName: \"kubernetes.io/projected/d981749f-83d5-4096-869d-617b2a6f1ce7-kube-api-access-wwb5s\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.343901 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-httpd-config\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.343948 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-combined-ca-bundle\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.344091 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-ovndb-tls-certs\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.344141 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-config\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.348194 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-ovndb-tls-certs\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.352019 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-httpd-config\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.352045 4933 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-config\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.358433 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-combined-ca-bundle\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.380717 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwb5s\" (UniqueName: \"kubernetes.io/projected/d981749f-83d5-4096-869d-617b2a6f1ce7-kube-api-access-wwb5s\") pod \"neutron-858c7d46f6-k58rb\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.446255 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.446601 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnzvk\" (UniqueName: \"kubernetes.io/projected/f7534185-21e1-4f0b-81e8-550e0c24b0b3-kube-api-access-jnzvk\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.446679 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.446744 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-config\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.446815 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.446863 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.519036 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.548392 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-config\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.548776 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.548945 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.549146 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.549431 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-config\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.549694 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.549848 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.550132 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.550254 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnzvk\" (UniqueName: \"kubernetes.io/projected/f7534185-21e1-4f0b-81e8-550e0c24b0b3-kube-api-access-jnzvk\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: 
I1202 16:15:08.552279 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.554014 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.567720 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnzvk\" (UniqueName: \"kubernetes.io/projected/f7534185-21e1-4f0b-81e8-550e0c24b0b3-kube-api-access-jnzvk\") pod \"dnsmasq-dns-5ccc5c4795-rsdtj\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.585049 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.649358 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.735669 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.772321 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a957352-db06-4697-9911-a5e2a15e6ef0","Type":"ContainerStarted","Data":"02e3728faee3e7b836bf3a7a8ab4b7af71063acc277d252d6124fa7b135c989b"} Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.772404 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.810487 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.928036 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-nb\") pod \"43ba7114-d190-4ddc-93e1-d00663935f84\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.928154 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xdvc\" (UniqueName: \"kubernetes.io/projected/43ba7114-d190-4ddc-93e1-d00663935f84-kube-api-access-7xdvc\") pod \"43ba7114-d190-4ddc-93e1-d00663935f84\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.928220 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-dns-svc\") pod \"43ba7114-d190-4ddc-93e1-d00663935f84\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.928265 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-config\") pod \"43ba7114-d190-4ddc-93e1-d00663935f84\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.928345 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-sb\") pod \"43ba7114-d190-4ddc-93e1-d00663935f84\" (UID: \"43ba7114-d190-4ddc-93e1-d00663935f84\") " Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.929490 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43ba7114-d190-4ddc-93e1-d00663935f84" (UID: "43ba7114-d190-4ddc-93e1-d00663935f84"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.929882 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43ba7114-d190-4ddc-93e1-d00663935f84" (UID: "43ba7114-d190-4ddc-93e1-d00663935f84"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.933665 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43ba7114-d190-4ddc-93e1-d00663935f84" (UID: "43ba7114-d190-4ddc-93e1-d00663935f84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.934770 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-config" (OuterVolumeSpecName: "config") pod "43ba7114-d190-4ddc-93e1-d00663935f84" (UID: "43ba7114-d190-4ddc-93e1-d00663935f84"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:08 crc kubenswrapper[4933]: I1202 16:15:08.938914 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ba7114-d190-4ddc-93e1-d00663935f84-kube-api-access-7xdvc" (OuterVolumeSpecName: "kube-api-access-7xdvc") pod "43ba7114-d190-4ddc-93e1-d00663935f84" (UID: "43ba7114-d190-4ddc-93e1-d00663935f84"). InnerVolumeSpecName "kube-api-access-7xdvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.032730 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.032750 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.032759 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xdvc\" (UniqueName: \"kubernetes.io/projected/43ba7114-d190-4ddc-93e1-d00663935f84-kube-api-access-7xdvc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.032768 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.032776 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ba7114-d190-4ddc-93e1-d00663935f84-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.355538 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-858c7d46f6-k58rb"] Dec 02 16:15:09 crc kubenswrapper[4933]: W1202 16:15:09.775934 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef22a91b_23f4_4686_9672_57a7ce6e12d0.slice/crio-86704492a6dd16e3cffe841d0c38fad6cc776f9d67966953cdff1096a7f5f034 WatchSource:0}: Error finding container 86704492a6dd16e3cffe841d0c38fad6cc776f9d67966953cdff1096a7f5f034: Status 404 returned error can't find the container with id 86704492a6dd16e3cffe841d0c38fad6cc776f9d67966953cdff1096a7f5f034 Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.794176 4933 generic.go:334] "Generic (PLEG): container finished" podID="f7179e9e-f623-44a2-9f70-021244770e56" containerID="21d83126c87174185d7f389dac8b1554c1fbb2e8b04e811961f27bd1a5cff784" exitCode=0 Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.794318 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6c948c7-gg98x" Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.794415 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pd5s5" event={"ID":"f7179e9e-f623-44a2-9f70-021244770e56","Type":"ContainerDied","Data":"21d83126c87174185d7f389dac8b1554c1fbb2e8b04e811961f27bd1a5cff784"} Dec 02 16:15:09 crc kubenswrapper[4933]: I1202 16:15:09.921973 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.013228 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.075198 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.086419 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.150105 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-gg98x"] Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.168409 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b6c948c7-gg98x"] Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.168520 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-config-volume\") pod \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.168812 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b88th\" (UniqueName: \"kubernetes.io/projected/753f73e1-f883-4c44-a499-3e73eadba372-kube-api-access-b88th\") pod \"753f73e1-f883-4c44-a499-3e73eadba372\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.168958 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-nb\") pod \"753f73e1-f883-4c44-a499-3e73eadba372\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.168984 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-config\") pod \"753f73e1-f883-4c44-a499-3e73eadba372\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.169026 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-dns-svc\") pod \"753f73e1-f883-4c44-a499-3e73eadba372\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.169101 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-secret-volume\") pod \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " Dec 02 
16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.169175 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7ddf\" (UniqueName: \"kubernetes.io/projected/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-kube-api-access-w7ddf\") pod \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\" (UID: \"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.169213 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-sb\") pod \"753f73e1-f883-4c44-a499-3e73eadba372\" (UID: \"753f73e1-f883-4c44-a499-3e73eadba372\") " Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.169799 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" (UID: "249f9151-f57b-4ab8-8e3a-5c5256e2ed1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.170122 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.185555 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753f73e1-f883-4c44-a499-3e73eadba372-kube-api-access-b88th" (OuterVolumeSpecName: "kube-api-access-b88th") pod "753f73e1-f883-4c44-a499-3e73eadba372" (UID: "753f73e1-f883-4c44-a499-3e73eadba372"). InnerVolumeSpecName "kube-api-access-b88th". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.186298 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-kube-api-access-w7ddf" (OuterVolumeSpecName: "kube-api-access-w7ddf") pod "249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" (UID: "249f9151-f57b-4ab8-8e3a-5c5256e2ed1c"). InnerVolumeSpecName "kube-api-access-w7ddf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.218325 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" (UID: "249f9151-f57b-4ab8-8e3a-5c5256e2ed1c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.273551 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b88th\" (UniqueName: \"kubernetes.io/projected/753f73e1-f883-4c44-a499-3e73eadba372-kube-api-access-b88th\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.273581 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.273589 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7ddf\" (UniqueName: \"kubernetes.io/projected/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c-kube-api-access-w7ddf\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.351535 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "753f73e1-f883-4c44-a499-3e73eadba372" (UID: "753f73e1-f883-4c44-a499-3e73eadba372"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.354427 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "753f73e1-f883-4c44-a499-3e73eadba372" (UID: "753f73e1-f883-4c44-a499-3e73eadba372"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.377746 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.377791 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.378588 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-rsdtj"] Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.457909 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "753f73e1-f883-4c44-a499-3e73eadba372" (UID: "753f73e1-f883-4c44-a499-3e73eadba372"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.479952 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.480264 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-config" (OuterVolumeSpecName: "config") pod "753f73e1-f883-4c44-a499-3e73eadba372" (UID: "753f73e1-f883-4c44-a499-3e73eadba372"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.582106 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753f73e1-f883-4c44-a499-3e73eadba372-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.807790 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.807790 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv" event={"ID":"249f9151-f57b-4ab8-8e3a-5c5256e2ed1c","Type":"ContainerDied","Data":"9b4158a9c618c852b5df6841332ed2eb564b7f95b723fa4755ac721afb6a6e51"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.807889 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b4158a9c618c852b5df6841332ed2eb564b7f95b723fa4755ac721afb6a6e51" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.809639 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-858c7d46f6-k58rb" event={"ID":"d981749f-83d5-4096-869d-617b2a6f1ce7","Type":"ContainerStarted","Data":"44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.809676 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-858c7d46f6-k58rb" event={"ID":"d981749f-83d5-4096-869d-617b2a6f1ce7","Type":"ContainerStarted","Data":"e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.809691 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.809700 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-858c7d46f6-k58rb" event={"ID":"d981749f-83d5-4096-869d-617b2a6f1ce7","Type":"ContainerStarted","Data":"4a1275fc69bb461d08284214c94e1baf3bc0df6abd4f78aa5c3b5cf816985c4e"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.811434 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" event={"ID":"753f73e1-f883-4c44-a499-3e73eadba372","Type":"ContainerDied","Data":"b2f1e17caef99f451ddbeb323efa37dc3d65316eb6a291df036a51b800efca4e"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.811447 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-hcdxn" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.811468 4933 scope.go:117] "RemoveContainer" containerID="c1d02d84063ca3a37879551577a39230871433a740de7d8e61bfff843d900172" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.821436 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerStarted","Data":"5be483c4efbbc0bdf004464f4b5ac3acc48e3c0bc23a17e87f1eb680e31449a9"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.835010 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-858c7d46f6-k58rb" podStartSLOduration=3.8349912489999998 podStartE2EDuration="3.834991249s" podCreationTimestamp="2025-12-02 16:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:10.828667277 +0000 UTC m=+1374.079894000" watchObservedRunningTime="2025-12-02 16:15:10.834991249 +0000 UTC m=+1374.086217952" Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.847288 4933 generic.go:334] "Generic (PLEG): container finished" podID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerID="266a0e409caef7d1f1179bd90f18a2c60de994cd64c89905025cfa118c98920b" exitCode=0 Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.847355 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" event={"ID":"f7534185-21e1-4f0b-81e8-550e0c24b0b3","Type":"ContainerDied","Data":"266a0e409caef7d1f1179bd90f18a2c60de994cd64c89905025cfa118c98920b"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.847386 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" event={"ID":"f7534185-21e1-4f0b-81e8-550e0c24b0b3","Type":"ContainerStarted","Data":"72ad20b5587dcb8d367b7637a2f4cc3d4b97f70f1acf8220f5ab7a8d2b5d3c79"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.850120 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef22a91b-23f4-4686-9672-57a7ce6e12d0","Type":"ContainerStarted","Data":"86704492a6dd16e3cffe841d0c38fad6cc776f9d67966953cdff1096a7f5f034"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.854497 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-649gh" event={"ID":"ae883c36-aac5-49f4-839c-d0140fe724cc","Type":"ContainerStarted","Data":"695f67b9aea9f7e5082f73616ee2fd565dbbf11fa4f169dba5f0d34bb67047dc"} Dec 02 16:15:10 crc kubenswrapper[4933]: I1202 16:15:10.994241 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-hcdxn"] Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.031370 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-hcdxn"] Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.045899 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-649gh" podStartSLOduration=10.704960198 podStartE2EDuration="47.045878551s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="2025-12-02 16:14:33.542843871 +0000 UTC m=+1336.794070564" lastFinishedPulling="2025-12-02 16:15:09.883762214 +0000 UTC m=+1373.134988917" observedRunningTime="2025-12-02 16:15:10.928766968 +0000 UTC m=+1374.179993671" watchObservedRunningTime="2025-12-02 16:15:11.045878551 
+0000 UTC m=+1374.297105274" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.068048 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ba7114-d190-4ddc-93e1-d00663935f84" path="/var/lib/kubelet/pods/43ba7114-d190-4ddc-93e1-d00663935f84/volumes" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.069151 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753f73e1-f883-4c44-a499-3e73eadba372" path="/var/lib/kubelet/pods/753f73e1-f883-4c44-a499-3e73eadba372/volumes" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.424968 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pd5s5" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.515639 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-combined-ca-bundle\") pod \"f7179e9e-f623-44a2-9f70-021244770e56\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.515738 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-scripts\") pod \"f7179e9e-f623-44a2-9f70-021244770e56\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.515914 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-config-data\") pod \"f7179e9e-f623-44a2-9f70-021244770e56\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.515985 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7179e9e-f623-44a2-9f70-021244770e56-logs\") pod \"f7179e9e-f623-44a2-9f70-021244770e56\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.516019 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82p95\" (UniqueName: \"kubernetes.io/projected/f7179e9e-f623-44a2-9f70-021244770e56-kube-api-access-82p95\") pod \"f7179e9e-f623-44a2-9f70-021244770e56\" (UID: \"f7179e9e-f623-44a2-9f70-021244770e56\") " Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.518186 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7179e9e-f623-44a2-9f70-021244770e56-logs" (OuterVolumeSpecName: "logs") pod "f7179e9e-f623-44a2-9f70-021244770e56" (UID: "f7179e9e-f623-44a2-9f70-021244770e56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.526065 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7179e9e-f623-44a2-9f70-021244770e56-kube-api-access-82p95" (OuterVolumeSpecName: "kube-api-access-82p95") pod "f7179e9e-f623-44a2-9f70-021244770e56" (UID: "f7179e9e-f623-44a2-9f70-021244770e56"). InnerVolumeSpecName "kube-api-access-82p95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.527009 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-scripts" (OuterVolumeSpecName: "scripts") pod "f7179e9e-f623-44a2-9f70-021244770e56" (UID: "f7179e9e-f623-44a2-9f70-021244770e56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.618448 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.618483 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7179e9e-f623-44a2-9f70-021244770e56-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.618492 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82p95\" (UniqueName: \"kubernetes.io/projected/f7179e9e-f623-44a2-9f70-021244770e56-kube-api-access-82p95\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.623111 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7179e9e-f623-44a2-9f70-021244770e56" (UID: "f7179e9e-f623-44a2-9f70-021244770e56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.634844 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-config-data" (OuterVolumeSpecName: "config-data") pod "f7179e9e-f623-44a2-9f70-021244770e56" (UID: "f7179e9e-f623-44a2-9f70-021244770e56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.719943 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.719972 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7179e9e-f623-44a2-9f70-021244770e56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.874178 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef22a91b-23f4-4686-9672-57a7ce6e12d0","Type":"ContainerStarted","Data":"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8"} Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.878553 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a957352-db06-4697-9911-a5e2a15e6ef0","Type":"ContainerStarted","Data":"804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7"} Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.924118 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pd5s5" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.924609 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pd5s5" event={"ID":"f7179e9e-f623-44a2-9f70-021244770e56","Type":"ContainerDied","Data":"fb9b424e4304e3a730d3035c6163c7a76b941ff1dc414952e10ffd3f54504c87"} Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.924646 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb9b424e4304e3a730d3035c6163c7a76b941ff1dc414952e10ffd3f54504c87" Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.957476 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" event={"ID":"f7534185-21e1-4f0b-81e8-550e0c24b0b3","Type":"ContainerStarted","Data":"65f079f4e0d724a170824f5c5fa2c7091b6e7cfc8b158d84a8b002b63c105f29"} Dec 02 16:15:11 crc kubenswrapper[4933]: I1202 16:15:11.957568 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.060188 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64bbf7fc9b-xn4tj"] Dec 02 16:15:12 crc kubenswrapper[4933]: E1202 16:15:12.061077 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753f73e1-f883-4c44-a499-3e73eadba372" containerName="init" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.061099 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="753f73e1-f883-4c44-a499-3e73eadba372" containerName="init" Dec 02 16:15:12 crc kubenswrapper[4933]: E1202 16:15:12.061111 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" containerName="collect-profiles" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.061142 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" containerName="collect-profiles" Dec 02 16:15:12 crc kubenswrapper[4933]: E1202 16:15:12.061188 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7179e9e-f623-44a2-9f70-021244770e56" containerName="placement-db-sync" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.061217 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7179e9e-f623-44a2-9f70-021244770e56" containerName="placement-db-sync" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.061509 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" containerName="collect-profiles" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.061557 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="753f73e1-f883-4c44-a499-3e73eadba372" containerName="init" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.061579 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7179e9e-f623-44a2-9f70-021244770e56" containerName="placement-db-sync" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.063401 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.075038 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.075300 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.075538 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.075717 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.075910 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xpbfr" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.086087 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64bbf7fc9b-xn4tj"] Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.087760 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" podStartSLOduration=4.08773741 podStartE2EDuration="4.08773741s" podCreationTimestamp="2025-12-02 16:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:12.003993644 +0000 UTC m=+1375.255220347" watchObservedRunningTime="2025-12-02 16:15:12.08773741 +0000 UTC m=+1375.338964113" Dec 02 16:15:12 crc kubenswrapper[4933]: E1202 16:15:12.106653 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7179e9e_f623_44a2_9f70_021244770e56.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.133654 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgwfn\" (UniqueName: \"kubernetes.io/projected/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-kube-api-access-vgwfn\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.133778 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-logs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.133816 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-public-tls-certs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.133922 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-config-data\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " 
pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.133959 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-scripts\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.134014 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-combined-ca-bundle\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.134073 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-internal-tls-certs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.234920 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-scripts\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.235203 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-combined-ca-bundle\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.235245 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-internal-tls-certs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.235275 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgwfn\" (UniqueName: \"kubernetes.io/projected/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-kube-api-access-vgwfn\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.235327 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-logs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.235356 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-public-tls-certs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 
16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.235413 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-config-data\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.237349 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-logs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.243001 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-scripts\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.245303 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-public-tls-certs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.245546 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-internal-tls-certs\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.245580 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-config-data\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.249382 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-combined-ca-bundle\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.256955 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgwfn\" (UniqueName: \"kubernetes.io/projected/250ea801-8fe7-44d3-b5ec-78cd22a3fa39-kube-api-access-vgwfn\") pod \"placement-64bbf7fc9b-xn4tj\" (UID: \"250ea801-8fe7-44d3-b5ec-78cd22a3fa39\") " pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.412836 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.986136 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef22a91b-23f4-4686-9672-57a7ce6e12d0","Type":"ContainerStarted","Data":"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f"} Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.987482 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-log" containerID="cri-o://87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8" gracePeriod=30 Dec 02 16:15:12 crc kubenswrapper[4933]: I1202 16:15:12.989320 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-httpd" containerID="cri-o://55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f" gracePeriod=30 Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.010941 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a957352-db06-4697-9911-a5e2a15e6ef0","Type":"ContainerStarted","Data":"20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae"} Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.010955 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-log" containerID="cri-o://804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7" gracePeriod=30 Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.011149 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-httpd" containerID="cri-o://20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae" gracePeriod=30 Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.090618 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.090595179 podStartE2EDuration="8.090595179s" podCreationTimestamp="2025-12-02 16:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:13.066128994 +0000 UTC m=+1376.317355697" watchObservedRunningTime="2025-12-02 16:15:13.090595179 +0000 UTC m=+1376.341821882" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.123519 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.123493933 podStartE2EDuration="7.123493933s" podCreationTimestamp="2025-12-02 16:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:13.024958045 +0000 UTC m=+1376.276184758" watchObservedRunningTime="2025-12-02 16:15:13.123493933 +0000 UTC m=+1376.374720636" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.199115 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64bbf7fc9b-xn4tj"] Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.667732 4933 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-7d5bbf46cc-92zkh"] Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.671173 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.677048 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.677065 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.709933 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d5bbf46cc-92zkh"] Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.848603 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-ovndb-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.848667 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-public-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.849533 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-combined-ca-bundle\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.849565 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-httpd-config\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.849591 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k46c\" (UniqueName: \"kubernetes.io/projected/03882bf4-d9e9-460d-a94e-fb17189d0278-kube-api-access-5k46c\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.849678 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-config\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.849978 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-internal-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc 
kubenswrapper[4933]: I1202 16:15:13.938686 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952094 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-ovndb-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952170 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-public-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952289 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-combined-ca-bundle\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952306 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-httpd-config\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952325 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k46c\" (UniqueName: \"kubernetes.io/projected/03882bf4-d9e9-460d-a94e-fb17189d0278-kube-api-access-5k46c\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952383 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-config\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.952410 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-internal-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.958586 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-internal-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.960466 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-config\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " 
pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.963680 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-combined-ca-bundle\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.965569 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-public-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.972598 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-httpd-config\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.975240 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03882bf4-d9e9-460d-a94e-fb17189d0278-ovndb-tls-certs\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:13 crc kubenswrapper[4933]: I1202 16:15:13.998781 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k46c\" (UniqueName: \"kubernetes.io/projected/03882bf4-d9e9-460d-a94e-fb17189d0278-kube-api-access-5k46c\") pod \"neutron-7d5bbf46cc-92zkh\" (UID: \"03882bf4-d9e9-460d-a94e-fb17189d0278\") " pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.009394 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.044781 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.052741 4933 generic.go:334] "Generic (PLEG): container finished" podID="39ec29fe-4f0d-4afe-9440-0cdc7b30d651" containerID="aa43f89a5ee3cc09277b8d1c5371526e918b571f89705e74235a7dbb6f5a2482" exitCode=0 Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.052849 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwgj6" event={"ID":"39ec29fe-4f0d-4afe-9440-0cdc7b30d651","Type":"ContainerDied","Data":"aa43f89a5ee3cc09277b8d1c5371526e918b571f89705e74235a7dbb6f5a2482"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.053568 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldl6x\" (UniqueName: \"kubernetes.io/projected/ef22a91b-23f4-4686-9672-57a7ce6e12d0-kube-api-access-ldl6x\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.053755 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-config-data\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.053807 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.054218 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-httpd-run\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.054247 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-scripts\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.054302 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-logs\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.054322 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-combined-ca-bundle\") pod \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\" (UID: \"ef22a91b-23f4-4686-9672-57a7ce6e12d0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.056361 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-logs" (OuterVolumeSpecName: "logs") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.056654 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064717 4933 generic.go:334] "Generic (PLEG): container finished" podID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerID="55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f" exitCode=0 Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064742 4933 generic.go:334] "Generic (PLEG): container finished" podID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerID="87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8" exitCode=143 Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064782 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef22a91b-23f4-4686-9672-57a7ce6e12d0","Type":"ContainerDied","Data":"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064807 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef22a91b-23f4-4686-9672-57a7ce6e12d0","Type":"ContainerDied","Data":"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064832 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef22a91b-23f4-4686-9672-57a7ce6e12d0","Type":"ContainerDied","Data":"86704492a6dd16e3cffe841d0c38fad6cc776f9d67966953cdff1096a7f5f034"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064848 4933 scope.go:117] "RemoveContainer" containerID="55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.064970 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.071025 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-scripts" (OuterVolumeSpecName: "scripts") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.079129 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef22a91b-23f4-4686-9672-57a7ce6e12d0-kube-api-access-ldl6x" (OuterVolumeSpecName: "kube-api-access-ldl6x") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "kube-api-access-ldl6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.107504 4933 generic.go:334] "Generic (PLEG): container finished" podID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerID="20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae" exitCode=0 Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.107539 4933 generic.go:334] "Generic (PLEG): container finished" podID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerID="804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7" exitCode=143 Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.107604 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a957352-db06-4697-9911-a5e2a15e6ef0","Type":"ContainerDied","Data":"20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.107632 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a957352-db06-4697-9911-a5e2a15e6ef0","Type":"ContainerDied","Data":"804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.107705 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.125222 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.146323 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64bbf7fc9b-xn4tj" event={"ID":"250ea801-8fe7-44d3-b5ec-78cd22a3fa39","Type":"ContainerStarted","Data":"4408d163dca72714f346faa03a642062befe9fea6870db74bd9492cd0a2991a6"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.146429 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64bbf7fc9b-xn4tj" event={"ID":"250ea801-8fe7-44d3-b5ec-78cd22a3fa39","Type":"ContainerStarted","Data":"04105bfce24e3a1c1b09b865978237d413ddb2cd02233dea44a6d941550539b0"} Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.149539 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.149574 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.178761 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-logs\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.179042 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-config-data\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.179158 4933 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-combined-ca-bundle\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.179253 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-scripts\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.179328 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-httpd-run\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.179437 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.179692 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6rlj\" (UniqueName: \"kubernetes.io/projected/4a957352-db06-4697-9911-a5e2a15e6ef0-kube-api-access-n6rlj\") pod \"4a957352-db06-4697-9911-a5e2a15e6ef0\" (UID: \"4a957352-db06-4697-9911-a5e2a15e6ef0\") " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.180205 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.180272 4933 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.180324 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.180375 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef22a91b-23f4-4686-9672-57a7ce6e12d0-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.180425 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldl6x\" (UniqueName: \"kubernetes.io/projected/ef22a91b-23f4-4686-9672-57a7ce6e12d0-kube-api-access-ldl6x\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.189123 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.199969 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-logs" (OuterVolumeSpecName: "logs") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.203106 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.212475 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-scripts" (OuterVolumeSpecName: "scripts") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.212567 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.216951 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64bbf7fc9b-xn4tj" podStartSLOduration=3.216929583 podStartE2EDuration="3.216929583s" podCreationTimestamp="2025-12-02 16:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:14.199249503 +0000 UTC m=+1377.450476206" watchObservedRunningTime="2025-12-02 16:15:14.216929583 +0000 UTC m=+1377.468156276" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.232601 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a957352-db06-4697-9911-a5e2a15e6ef0-kube-api-access-n6rlj" (OuterVolumeSpecName: "kube-api-access-n6rlj") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). InnerVolumeSpecName "kube-api-access-n6rlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.236534 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-config-data" (OuterVolumeSpecName: "config-data") pod "ef22a91b-23f4-4686-9672-57a7ce6e12d0" (UID: "ef22a91b-23f4-4686-9672-57a7ce6e12d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.270072 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.274462 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285234 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285270 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285282 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285293 4933 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285319 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285331 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6rlj\" (UniqueName: \"kubernetes.io/projected/4a957352-db06-4697-9911-a5e2a15e6ef0-kube-api-access-n6rlj\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285349 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285360 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a957352-db06-4697-9911-a5e2a15e6ef0-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.285371 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef22a91b-23f4-4686-9672-57a7ce6e12d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.301255 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-config-data" (OuterVolumeSpecName: "config-data") pod "4a957352-db06-4697-9911-a5e2a15e6ef0" (UID: "4a957352-db06-4697-9911-a5e2a15e6ef0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.356971 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.388146 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.388179 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a957352-db06-4697-9911-a5e2a15e6ef0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.411430 4933 scope.go:117] "RemoveContainer" containerID="87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.449787 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.480892 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.491250 4933 scope.go:117] "RemoveContainer" containerID="55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.491977 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f\": container with ID starting with 55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f not found: ID does not exist" containerID="55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.492029 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f"} err="failed to get container status \"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f\": rpc error: code = NotFound desc = could not find container \"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f\": container with ID starting with 55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f not found: ID does not exist" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.492057 4933 scope.go:117] "RemoveContainer" containerID="87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.492464 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8\": container with ID starting with 87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8 not found: ID does not exist" containerID="87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.492504 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8"} err="failed to get container status \"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8\": rpc error: code = NotFound desc = could 
not find container \"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8\": container with ID starting with 87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8 not found: ID does not exist" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.492531 4933 scope.go:117] "RemoveContainer" containerID="55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.495110 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f"} err="failed to get container status \"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f\": rpc error: code = NotFound desc = could not find container \"55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f\": container with ID starting with 55cb85ceed22dd3ba266235cd16dffe5c36bca5deb769b3023a594a835016f6f not found: ID does not exist" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.495164 4933 scope.go:117] "RemoveContainer" containerID="87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.497435 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8"} err="failed to get container status \"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8\": rpc error: code = NotFound desc = could not find container \"87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8\": container with ID starting with 87ec0d64fe245d52731152be4488ea3425e1eed1e7b848fa64c2e0012a3edac8 not found: ID does not exist" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.497475 4933 scope.go:117] "RemoveContainer" containerID="20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.500382 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.504652 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-log" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.504694 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-log" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.504719 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-httpd" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.504731 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-httpd" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.504748 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-log" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.504759 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-log" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.504802 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-httpd" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.504812 4933 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-httpd" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.505211 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-log" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.505230 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-httpd" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.505244 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" containerName="glance-log" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.505279 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" containerName="glance-httpd" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.509498 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.516739 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.519016 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fkqb" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.519336 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.519481 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.519621 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.526989 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.574278 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.593183 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.595164 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.601068 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.601139 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.601577 4933 scope.go:117] "RemoveContainer" containerID="804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.603259 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.689984 4933 scope.go:117] "RemoveContainer" containerID="20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.690732 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae\": container with ID starting with 20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae not found: ID does not exist" containerID="20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.690756 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae"} err="failed to get container status \"20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae\": rpc error: code = NotFound desc = could not find container \"20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae\": container with ID starting with 20796fd5735b6a1cb92e902f30c0069563a9319191a9688c37f869b1d7993fae not found: ID does not exist" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.690781 4933 scope.go:117] "RemoveContainer" containerID="804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7" Dec 02 16:15:14 crc kubenswrapper[4933]: E1202 16:15:14.691545 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7\": container with ID starting with 804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7 not found: ID does not exist" containerID="804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.691570 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7"} err="failed to get container status \"804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7\": rpc error: code = NotFound desc = could not find container \"804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7\": container with ID starting with 804083dd5edcb7c4348c4e1a844c078de80adbc62db64af94c5e39eb09efb0e7 not found: ID does not exist" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699015 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699069 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699091 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699110 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699130 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrf27\" (UniqueName: \"kubernetes.io/projected/103581af-5f22-4b11-a0a3-093da3661978-kube-api-access-vrf27\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699183 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699203 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699227 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-config-data\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699248 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699265 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg29m\" (UniqueName: 
\"kubernetes.io/projected/fbc64d10-680e-40b1-816f-9d63f8eb8b11-kube-api-access-qg29m\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699290 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699315 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699332 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699365 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-logs\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699394 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-logs\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.699427 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-scripts\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801416 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801463 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801489 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-config-data\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801511 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801533 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg29m\" (UniqueName: \"kubernetes.io/projected/fbc64d10-680e-40b1-816f-9d63f8eb8b11-kube-api-access-qg29m\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801559 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801613 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801635 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801693 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-logs\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801743 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-logs\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801802 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-scripts\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801911 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801941 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801969 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.801986 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.802004 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrf27\" (UniqueName: \"kubernetes.io/projected/103581af-5f22-4b11-a0a3-093da3661978-kube-api-access-vrf27\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.802683 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.802937 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-logs\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.803244 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-logs\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.803481 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.806151 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 
02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.811263 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.813117 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.813662 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.817544 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-scripts\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.830005 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.835978 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-config-data\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.842497 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.844611 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.848630 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.865661 4933 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qg29m\" (UniqueName: \"kubernetes.io/projected/fbc64d10-680e-40b1-816f-9d63f8eb8b11-kube-api-access-qg29m\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.876548 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrf27\" (UniqueName: \"kubernetes.io/projected/103581af-5f22-4b11-a0a3-093da3661978-kube-api-access-vrf27\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.945229 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " pod="openstack/glance-default-external-api-0" Dec 02 16:15:14 crc kubenswrapper[4933]: I1202 16:15:14.960078 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:14.998449 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d5bbf46cc-92zkh"] Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.067624 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a957352-db06-4697-9911-a5e2a15e6ef0" path="/var/lib/kubelet/pods/4a957352-db06-4697-9911-a5e2a15e6ef0/volumes" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.068936 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef22a91b-23f4-4686-9672-57a7ce6e12d0" path="/var/lib/kubelet/pods/ef22a91b-23f4-4686-9672-57a7ce6e12d0/volumes" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.148042 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.168563 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64bbf7fc9b-xn4tj" event={"ID":"250ea801-8fe7-44d3-b5ec-78cd22a3fa39","Type":"ContainerStarted","Data":"499b1378751087febf40568f0ca3e7ebdceb8e613ab3bd05e5c309290ee463fa"} Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.179320 4933 generic.go:334] "Generic (PLEG): container finished" podID="ae883c36-aac5-49f4-839c-d0140fe724cc" containerID="695f67b9aea9f7e5082f73616ee2fd565dbbf11fa4f169dba5f0d34bb67047dc" exitCode=0 Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.179613 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-649gh" event={"ID":"ae883c36-aac5-49f4-839c-d0140fe724cc","Type":"ContainerDied","Data":"695f67b9aea9f7e5082f73616ee2fd565dbbf11fa4f169dba5f0d34bb67047dc"} Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.183766 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d5bbf46cc-92zkh" event={"ID":"03882bf4-d9e9-460d-a94e-fb17189d0278","Type":"ContainerStarted","Data":"d8aca38f159db60781633d6fe02876184ca38410f04a96d150b0af69e6eae4c3"} Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.235213 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.737280 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.867701 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-scripts\") pod \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.867766 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-config-data\") pod \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.867809 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-combined-ca-bundle\") pod \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.867932 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-credential-keys\") pod \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.867982 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xkn\" (UniqueName: \"kubernetes.io/projected/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-kube-api-access-h6xkn\") pod \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.868056 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-fernet-keys\") pod \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\" (UID: \"39ec29fe-4f0d-4afe-9440-0cdc7b30d651\") " Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.872731 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "39ec29fe-4f0d-4afe-9440-0cdc7b30d651" (UID: "39ec29fe-4f0d-4afe-9440-0cdc7b30d651"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.873875 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "39ec29fe-4f0d-4afe-9440-0cdc7b30d651" (UID: "39ec29fe-4f0d-4afe-9440-0cdc7b30d651"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.874973 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-scripts" (OuterVolumeSpecName: "scripts") pod "39ec29fe-4f0d-4afe-9440-0cdc7b30d651" (UID: "39ec29fe-4f0d-4afe-9440-0cdc7b30d651"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.876191 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-kube-api-access-h6xkn" (OuterVolumeSpecName: "kube-api-access-h6xkn") pod "39ec29fe-4f0d-4afe-9440-0cdc7b30d651" (UID: "39ec29fe-4f0d-4afe-9440-0cdc7b30d651"). InnerVolumeSpecName "kube-api-access-h6xkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.905990 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39ec29fe-4f0d-4afe-9440-0cdc7b30d651" (UID: "39ec29fe-4f0d-4afe-9440-0cdc7b30d651"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.912713 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-config-data" (OuterVolumeSpecName: "config-data") pod "39ec29fe-4f0d-4afe-9440-0cdc7b30d651" (UID: "39ec29fe-4f0d-4afe-9440-0cdc7b30d651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.948960 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:15:15 crc kubenswrapper[4933]: W1202 16:15:15.952593 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod103581af_5f22_4b11_a0a3_093da3661978.slice/crio-b40d0f3200d7ab3da43e2885e75f07a1751e33ceaa9d2399305f86b2135ea2b7 WatchSource:0}: Error finding container b40d0f3200d7ab3da43e2885e75f07a1751e33ceaa9d2399305f86b2135ea2b7: Status 404 returned error can't find the container with id b40d0f3200d7ab3da43e2885e75f07a1751e33ceaa9d2399305f86b2135ea2b7 Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.970320 4933 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.970365 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.970374 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.970382 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.970395 4933 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:15 crc kubenswrapper[4933]: I1202 16:15:15.970403 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xkn\" (UniqueName: \"kubernetes.io/projected/39ec29fe-4f0d-4afe-9440-0cdc7b30d651-kube-api-access-h6xkn\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.042488 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:15:16 crc kubenswrapper[4933]: W1202 16:15:16.044153 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbc64d10_680e_40b1_816f_9d63f8eb8b11.slice/crio-6b4c8ebc1d1fb92cda87594582e1b13c1ac130b328b42367044b0a9856378a38 WatchSource:0}: Error finding container 6b4c8ebc1d1fb92cda87594582e1b13c1ac130b328b42367044b0a9856378a38: Status 404 returned error can't find the container with id 6b4c8ebc1d1fb92cda87594582e1b13c1ac130b328b42367044b0a9856378a38 Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.218581 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d5bbf46cc-92zkh" event={"ID":"03882bf4-d9e9-460d-a94e-fb17189d0278","Type":"ContainerStarted","Data":"eac78c79c5603269599003dc0d5f503e77c963273775f7f9b706e9388d726c17"} Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.220159 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d5bbf46cc-92zkh" event={"ID":"03882bf4-d9e9-460d-a94e-fb17189d0278","Type":"ContainerStarted","Data":"6cadf7857650d01166cbdf767d5a75c04d1df370817708e41dbbbcec1a3e0d8c"} Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.220181 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.224567 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwgj6" event={"ID":"39ec29fe-4f0d-4afe-9440-0cdc7b30d651","Type":"ContainerDied","Data":"9971143dd34708cdc50c8e12f324a4488f2e2a4ffecf3da1386dff627a0e5223"} Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.224617 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9971143dd34708cdc50c8e12f324a4488f2e2a4ffecf3da1386dff627a0e5223" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.224699 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rwgj6" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.268176 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc64d10-680e-40b1-816f-9d63f8eb8b11","Type":"ContainerStarted","Data":"6b4c8ebc1d1fb92cda87594582e1b13c1ac130b328b42367044b0a9856378a38"} Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.276342 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d5bbf46cc-92zkh" podStartSLOduration=3.27631863 podStartE2EDuration="3.27631863s" podCreationTimestamp="2025-12-02 16:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:16.258506256 +0000 UTC m=+1379.509732959" watchObservedRunningTime="2025-12-02 16:15:16.27631863 +0000 UTC m=+1379.527545333" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.297115 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"103581af-5f22-4b11-a0a3-093da3661978","Type":"ContainerStarted","Data":"b40d0f3200d7ab3da43e2885e75f07a1751e33ceaa9d2399305f86b2135ea2b7"} Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.313529 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5445f4c57f-62kcb"] Dec 02 16:15:16 crc kubenswrapper[4933]: E1202 16:15:16.314028 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ec29fe-4f0d-4afe-9440-0cdc7b30d651" containerName="keystone-bootstrap" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.314047 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ec29fe-4f0d-4afe-9440-0cdc7b30d651" containerName="keystone-bootstrap" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.314295 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ec29fe-4f0d-4afe-9440-0cdc7b30d651" containerName="keystone-bootstrap" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.315026 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.322473 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.322683 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.323464 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntjs" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.323596 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.323942 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.324038 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.352569 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5445f4c57f-62kcb"] Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489423 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfvvw\" (UniqueName: \"kubernetes.io/projected/e87150f8-ace7-485f-bbfe-8818205e400b-kube-api-access-lfvvw\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489497 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-credential-keys\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-internal-tls-certs\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489588 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-config-data\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489671 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-public-tls-certs\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489809 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-scripts\") pod \"keystone-5445f4c57f-62kcb\" (UID: 
\"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489856 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-fernet-keys\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.489897 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-combined-ca-bundle\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.642491 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-public-tls-certs\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.642793 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-scripts\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.642912 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-fernet-keys\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.643049 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-combined-ca-bundle\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.643215 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfvvw\" (UniqueName: \"kubernetes.io/projected/e87150f8-ace7-485f-bbfe-8818205e400b-kube-api-access-lfvvw\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.643314 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-credential-keys\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.643460 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-internal-tls-certs\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " 
pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.643593 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-config-data\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.653576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-credential-keys\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.654493 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-public-tls-certs\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.655015 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-combined-ca-bundle\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.656321 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-scripts\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.663653 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-config-data\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.667420 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-fernet-keys\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.682371 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87150f8-ace7-485f-bbfe-8818205e400b-internal-tls-certs\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.692045 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfvvw\" (UniqueName: \"kubernetes.io/projected/e87150f8-ace7-485f-bbfe-8818205e400b-kube-api-access-lfvvw\") pod \"keystone-5445f4c57f-62kcb\" (UID: \"e87150f8-ace7-485f-bbfe-8818205e400b\") " pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:16 crc kubenswrapper[4933]: I1202 16:15:16.964135 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:17 crc kubenswrapper[4933]: I1202 16:15:17.169528 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:15:17 crc kubenswrapper[4933]: I1202 16:15:17.173079 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:15:17 crc kubenswrapper[4933]: I1202 16:15:17.173138 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:15:17 crc kubenswrapper[4933]: I1202 16:15:17.174952 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:15:17 crc kubenswrapper[4933]: I1202 16:15:17.175017 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112" gracePeriod=600 Dec 02 16:15:18 crc kubenswrapper[4933]: I1202 16:15:18.344176 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc64d10-680e-40b1-816f-9d63f8eb8b11","Type":"ContainerStarted","Data":"f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0"} Dec 02 16:15:18 crc kubenswrapper[4933]: I1202 16:15:18.347143 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112" exitCode=0 Dec 02 16:15:18 crc kubenswrapper[4933]: I1202 16:15:18.347159 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112"} Dec 02 16:15:18 crc kubenswrapper[4933]: I1202 16:15:18.347240 4933 scope.go:117] "RemoveContainer" containerID="6a4e12f1c3641af98254a5deef90580d464dc44072cf6822887fee1b15222e36" Dec 02 16:15:18 crc kubenswrapper[4933]: I1202 16:15:18.349493 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"103581af-5f22-4b11-a0a3-093da3661978","Type":"ContainerStarted","Data":"37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908"} Dec 02 16:15:18 crc kubenswrapper[4933]: I1202 16:15:18.652011 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:19 crc kubenswrapper[4933]: I1202 16:15:19.782047 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6ffb94d8ff-wlntx"] Dec 02 16:15:19 crc kubenswrapper[4933]: I1202 16:15:19.783090 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="dnsmasq-dns" containerID="cri-o://4efceaedaa6cc040fda75de4cbd3c0d92c3c599f1dd7f6fe5f60ab7da99f525d" gracePeriod=10 Dec 02 16:15:20 crc kubenswrapper[4933]: I1202 16:15:20.099684 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.178:5353: connect: connection refused" Dec 02 16:15:20 crc kubenswrapper[4933]: I1202 16:15:20.785464 4933 generic.go:334] "Generic (PLEG): container finished" podID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerID="4efceaedaa6cc040fda75de4cbd3c0d92c3c599f1dd7f6fe5f60ab7da99f525d" exitCode=0 Dec 02 16:15:20 crc kubenswrapper[4933]: I1202 16:15:20.785540 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" event={"ID":"4250e624-4c9d-426c-8856-76782f0bb0e3","Type":"ContainerDied","Data":"4efceaedaa6cc040fda75de4cbd3c0d92c3c599f1dd7f6fe5f60ab7da99f525d"} Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.440348 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-649gh" Dec 02 16:15:22 crc kubenswrapper[4933]: E1202 16:15:22.507093 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.624535 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-db-sync-config-data\") pod \"ae883c36-aac5-49f4-839c-d0140fe724cc\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.625983 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqts\" (UniqueName: \"kubernetes.io/projected/ae883c36-aac5-49f4-839c-d0140fe724cc-kube-api-access-mjqts\") pod \"ae883c36-aac5-49f4-839c-d0140fe724cc\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.626238 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-combined-ca-bundle\") pod \"ae883c36-aac5-49f4-839c-d0140fe724cc\" (UID: \"ae883c36-aac5-49f4-839c-d0140fe724cc\") " Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.631317 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae883c36-aac5-49f4-839c-d0140fe724cc-kube-api-access-mjqts" (OuterVolumeSpecName: "kube-api-access-mjqts") pod "ae883c36-aac5-49f4-839c-d0140fe724cc" (UID: "ae883c36-aac5-49f4-839c-d0140fe724cc"). InnerVolumeSpecName "kube-api-access-mjqts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.631403 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ae883c36-aac5-49f4-839c-d0140fe724cc" (UID: "ae883c36-aac5-49f4-839c-d0140fe724cc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.673040 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae883c36-aac5-49f4-839c-d0140fe724cc" (UID: "ae883c36-aac5-49f4-839c-d0140fe724cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.728697 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.728741 4933 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae883c36-aac5-49f4-839c-d0140fe724cc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.728750 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqts\" (UniqueName: \"kubernetes.io/projected/ae883c36-aac5-49f4-839c-d0140fe724cc-kube-api-access-mjqts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.879249 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.880335 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-649gh" event={"ID":"ae883c36-aac5-49f4-839c-d0140fe724cc","Type":"ContainerDied","Data":"9e3a9a011e145842e25c5a2cfa03fcee26cf81e130cb8b1fe579741a22345042"} Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.880374 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e3a9a011e145842e25c5a2cfa03fcee26cf81e130cb8b1fe579741a22345042" Dec 02 16:15:22 crc kubenswrapper[4933]: I1202 16:15:22.880427 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-649gh" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.055632 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-config\") pod \"4250e624-4c9d-426c-8856-76782f0bb0e3\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.055853 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-dns-svc\") pod \"4250e624-4c9d-426c-8856-76782f0bb0e3\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.055956 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-nb\") pod \"4250e624-4c9d-426c-8856-76782f0bb0e3\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.055997 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv2zw\" (UniqueName: \"kubernetes.io/projected/4250e624-4c9d-426c-8856-76782f0bb0e3-kube-api-access-qv2zw\") pod \"4250e624-4c9d-426c-8856-76782f0bb0e3\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.056064 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-sb\") pod \"4250e624-4c9d-426c-8856-76782f0bb0e3\" (UID: \"4250e624-4c9d-426c-8856-76782f0bb0e3\") " Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.087563 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4250e624-4c9d-426c-8856-76782f0bb0e3-kube-api-access-qv2zw" (OuterVolumeSpecName: "kube-api-access-qv2zw") pod "4250e624-4c9d-426c-8856-76782f0bb0e3" (UID: "4250e624-4c9d-426c-8856-76782f0bb0e3"). InnerVolumeSpecName "kube-api-access-qv2zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.150008 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5445f4c57f-62kcb"] Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.167997 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv2zw\" (UniqueName: \"kubernetes.io/projected/4250e624-4c9d-426c-8856-76782f0bb0e3-kube-api-access-qv2zw\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.193015 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-config" (OuterVolumeSpecName: "config") pod "4250e624-4c9d-426c-8856-76782f0bb0e3" (UID: "4250e624-4c9d-426c-8856-76782f0bb0e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.198318 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4250e624-4c9d-426c-8856-76782f0bb0e3" (UID: "4250e624-4c9d-426c-8856-76782f0bb0e3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.206324 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4250e624-4c9d-426c-8856-76782f0bb0e3" (UID: "4250e624-4c9d-426c-8856-76782f0bb0e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.248676 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4250e624-4c9d-426c-8856-76782f0bb0e3" (UID: "4250e624-4c9d-426c-8856-76782f0bb0e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.270341 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.270371 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.270381 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:23 crc kubenswrapper[4933]: I1202 16:15:23.270390 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4250e624-4c9d-426c-8856-76782f0bb0e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.133144 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"} Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.149360 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56bd7dd77f-sxk4g"] Dec 02 16:15:24 crc kubenswrapper[4933]: E1202 16:15:24.150065 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="init" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.150079 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="init" Dec 02 16:15:24 crc kubenswrapper[4933]: E1202 16:15:24.150110 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="dnsmasq-dns" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.150116 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="dnsmasq-dns" Dec 02 16:15:24 crc kubenswrapper[4933]: E1202 16:15:24.150131 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae883c36-aac5-49f4-839c-d0140fe724cc" containerName="barbican-db-sync" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.150137 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ae883c36-aac5-49f4-839c-d0140fe724cc" containerName="barbican-db-sync" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.150357 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" containerName="dnsmasq-dns" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.150380 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae883c36-aac5-49f4-839c-d0140fe724cc" containerName="barbican-db-sync" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.172660 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.175627 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xjnn2" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.176987 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5445f4c57f-62kcb" event={"ID":"e87150f8-ace7-485f-bbfe-8818205e400b","Type":"ContainerStarted","Data":"0e3773afb70c49fc08b51e3ac43405da7a2ec395104159acfbb4ecce03150faf"} Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.177117 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.177343 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.202109 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64dd586798-jl4hw"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.212685 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.218922 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.234608 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerStarted","Data":"231ac796b1f2247613553fc4a1ad4f8c9d33f2ebe0b8b3755239d3b29f0a8d3f"} Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.242419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kzxbf" event={"ID":"c4239535-0d5c-4b17-a695-9f57efb4d381","Type":"ContainerStarted","Data":"dae24602cbed033884446a88e1c910eb45268850208f26904539043d4084c582"} Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.262807 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" event={"ID":"4250e624-4c9d-426c-8856-76782f0bb0e3","Type":"ContainerDied","Data":"14dfc16206e6010bb396f674bfc4a8272168a9e5f3375d524bc48cb36b358838"} Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.268548 4933 scope.go:117] "RemoveContainer" containerID="4efceaedaa6cc040fda75de4cbd3c0d92c3c599f1dd7f6fe5f60ab7da99f525d" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.263492 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-wlntx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.353942 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56bd7dd77f-sxk4g"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394067 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-combined-ca-bundle\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394272 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-combined-ca-bundle\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394306 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-config-data-custom\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394382 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-config-data\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394530 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04729c7a-7c1b-4138-832a-f6d0bf327720-logs\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394630 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6k2z\" (UniqueName: \"kubernetes.io/projected/04729c7a-7c1b-4138-832a-f6d0bf327720-kube-api-access-m6k2z\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394666 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-config-data-custom\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394714 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78795142-23f4-4bfd-ba25-479d6cc3c19f-logs\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") 
" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394750 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-config-data\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.394771 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c59v\" (UniqueName: \"kubernetes.io/projected/78795142-23f4-4bfd-ba25-479d6cc3c19f-kube-api-access-8c59v\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.406026 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64dd586798-jl4hw"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.437271 4933 scope.go:117] "RemoveContainer" containerID="feb12c25836a067ef3c2f30bef307a3f9cb9fd99f8512dff2638256a3f21cdc0" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.439179 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kzxj4"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.441295 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.469423 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kzxj4"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.487890 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-kzxbf" podStartSLOduration=11.509125478 podStartE2EDuration="1m0.487867788s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="2025-12-02 16:14:33.556905885 +0000 UTC m=+1336.808132628" lastFinishedPulling="2025-12-02 16:15:22.535648235 +0000 UTC m=+1385.786874938" observedRunningTime="2025-12-02 16:15:24.330300096 +0000 UTC m=+1387.581526799" watchObservedRunningTime="2025-12-02 16:15:24.487867788 +0000 UTC m=+1387.739094501" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497383 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04729c7a-7c1b-4138-832a-f6d0bf327720-logs\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497478 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6k2z\" (UniqueName: \"kubernetes.io/projected/04729c7a-7c1b-4138-832a-f6d0bf327720-kube-api-access-m6k2z\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497515 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-config-data-custom\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: 
\"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497551 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78795142-23f4-4bfd-ba25-479d6cc3c19f-logs\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497573 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-config-data\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497592 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c59v\" (UniqueName: \"kubernetes.io/projected/78795142-23f4-4bfd-ba25-479d6cc3c19f-kube-api-access-8c59v\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497664 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-combined-ca-bundle\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497681 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-combined-ca-bundle\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497708 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-config-data-custom\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.497764 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-config-data\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.499376 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04729c7a-7c1b-4138-832a-f6d0bf327720-logs\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.502434 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/78795142-23f4-4bfd-ba25-479d6cc3c19f-logs\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.514438 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-combined-ca-bundle\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.515897 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-config-data\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.515957 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-config-data-custom\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.516874 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04729c7a-7c1b-4138-832a-f6d0bf327720-config-data-custom\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.517180 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-config-data\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.542593 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78795142-23f4-4bfd-ba25-479d6cc3c19f-combined-ca-bundle\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.549663 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-647f597d98-xvnzx"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.551582 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.560422 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6k2z\" (UniqueName: \"kubernetes.io/projected/04729c7a-7c1b-4138-832a-f6d0bf327720-kube-api-access-m6k2z\") pod \"barbican-worker-56bd7dd77f-sxk4g\" (UID: \"04729c7a-7c1b-4138-832a-f6d0bf327720\") " pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.562689 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.563604 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-647f597d98-xvnzx"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.564424 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c59v\" (UniqueName: \"kubernetes.io/projected/78795142-23f4-4bfd-ba25-479d6cc3c19f-kube-api-access-8c59v\") pod \"barbican-keystone-listener-64dd586798-jl4hw\" (UID: \"78795142-23f4-4bfd-ba25-479d6cc3c19f\") " pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.564486 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56bd7dd77f-sxk4g" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.580027 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-wlntx"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.591665 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-wlntx"] Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.599817 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.600070 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmqc6\" (UniqueName: \"kubernetes.io/projected/4cc5d99c-4636-42d0-a7d3-6af1d371695c-kube-api-access-cmqc6\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.600515 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-config\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.600621 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-svc\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.600745 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.600980 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.739324 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741422 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-config\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741460 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-svc\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741498 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-combined-ca-bundle\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741538 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741606 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data-custom\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741631 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741652 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d1316b-e770-4a63-bd2c-f9c51b6f3082-logs\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " 
pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741686 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741708 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmqc6\" (UniqueName: \"kubernetes.io/projected/4cc5d99c-4636-42d0-a7d3-6af1d371695c-kube-api-access-cmqc6\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741740 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.741761 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2qz5\" (UniqueName: \"kubernetes.io/projected/59d1316b-e770-4a63-bd2c-f9c51b6f3082-kube-api-access-b2qz5\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.742913 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-config\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.743873 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-svc\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.744273 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.744324 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.744442 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " 
pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.768730 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmqc6\" (UniqueName: \"kubernetes.io/projected/4cc5d99c-4636-42d0-a7d3-6af1d371695c-kube-api-access-cmqc6\") pod \"dnsmasq-dns-688c87cc99-kzxj4\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.778906 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.843100 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data-custom\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.843603 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d1316b-e770-4a63-bd2c-f9c51b6f3082-logs\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.843676 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.843700 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2qz5\" (UniqueName: \"kubernetes.io/projected/59d1316b-e770-4a63-bd2c-f9c51b6f3082-kube-api-access-b2qz5\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.843787 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-combined-ca-bundle\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.852231 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d1316b-e770-4a63-bd2c-f9c51b6f3082-logs\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.860400 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data-custom\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.865741 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.869912 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-combined-ca-bundle\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:24 crc kubenswrapper[4933]: I1202 16:15:24.893608 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2qz5\" (UniqueName: \"kubernetes.io/projected/59d1316b-e770-4a63-bd2c-f9c51b6f3082-kube-api-access-b2qz5\") pod \"barbican-api-647f597d98-xvnzx\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.074371 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4250e624-4c9d-426c-8856-76782f0bb0e3" path="/var/lib/kubelet/pods/4250e624-4c9d-426c-8856-76782f0bb0e3/volumes" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.090429 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.387264 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56bd7dd77f-sxk4g"] Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.396335 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc64d10-680e-40b1-816f-9d63f8eb8b11","Type":"ContainerStarted","Data":"5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f"} Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.416834 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"103581af-5f22-4b11-a0a3-093da3661978","Type":"ContainerStarted","Data":"26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107"} Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.459084 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5445f4c57f-62kcb" event={"ID":"e87150f8-ace7-485f-bbfe-8818205e400b","Type":"ContainerStarted","Data":"5195b175c15661447f98c5e1ce049afd7f1098cfbb4432323f44ff97c4a643c3"} Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.460043 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.472813 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.472789189 podStartE2EDuration="11.472789189s" podCreationTimestamp="2025-12-02 16:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:25.437260544 +0000 UTC m=+1388.688487247" watchObservedRunningTime="2025-12-02 16:15:25.472789189 +0000 UTC m=+1388.724015892" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.493125 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gtbcc" 
event={"ID":"9c326307-73df-462e-98ec-0e4dc89fdd54","Type":"ContainerStarted","Data":"c0ba5a252954e9480867705618a22bf881b3aa04ad391282d038ca64caa0c0f4"} Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.524226 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.524197847 podStartE2EDuration="11.524197847s" podCreationTimestamp="2025-12-02 16:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:25.477489727 +0000 UTC m=+1388.728716440" watchObservedRunningTime="2025-12-02 16:15:25.524197847 +0000 UTC m=+1388.775424550" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.597006 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5445f4c57f-62kcb" podStartSLOduration=9.596987855 podStartE2EDuration="9.596987855s" podCreationTimestamp="2025-12-02 16:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:25.510806523 +0000 UTC m=+1388.762033226" watchObservedRunningTime="2025-12-02 16:15:25.596987855 +0000 UTC m=+1388.848214558" Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.602951 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gtbcc" podStartSLOduration=12.62690034 podStartE2EDuration="1m1.602933857s" podCreationTimestamp="2025-12-02 16:14:24 +0000 UTC" firstStartedPulling="2025-12-02 16:14:33.558934999 +0000 UTC m=+1336.810161742" lastFinishedPulling="2025-12-02 16:15:22.534968556 +0000 UTC m=+1385.786195259" observedRunningTime="2025-12-02 16:15:25.542549065 +0000 UTC m=+1388.793775768" watchObservedRunningTime="2025-12-02 16:15:25.602933857 +0000 UTC m=+1388.854160560" Dec 02 16:15:25 crc kubenswrapper[4933]: W1202 16:15:25.619486 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cc5d99c_4636_42d0_a7d3_6af1d371695c.slice/crio-e656ba1d5be3675ceccc6b78d3a1bd287dcb194dc5446d5e05426ebc21495ff8 WatchSource:0}: Error finding container e656ba1d5be3675ceccc6b78d3a1bd287dcb194dc5446d5e05426ebc21495ff8: Status 404 returned error can't find the container with id e656ba1d5be3675ceccc6b78d3a1bd287dcb194dc5446d5e05426ebc21495ff8 Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.632325 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kzxj4"] Dec 02 16:15:25 crc kubenswrapper[4933]: I1202 16:15:25.792121 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64dd586798-jl4hw"] Dec 02 16:15:25 crc kubenswrapper[4933]: W1202 16:15:25.811550 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78795142_23f4_4bfd_ba25_479d6cc3c19f.slice/crio-a7bf612c4b720fac14d9a4497b39d574df002d4cf5123ebc3cf1ab1274190731 WatchSource:0}: Error finding container a7bf612c4b720fac14d9a4497b39d574df002d4cf5123ebc3cf1ab1274190731: Status 404 returned error can't find the container with id a7bf612c4b720fac14d9a4497b39d574df002d4cf5123ebc3cf1ab1274190731 Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.023617 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-647f597d98-xvnzx"] Dec 02 16:15:26 crc kubenswrapper[4933]: W1202 
16:15:26.030103 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59d1316b_e770_4a63_bd2c_f9c51b6f3082.slice/crio-bf785c10d5f86cd47afb4f81c65ef7a71ff46388ed76ca8d9d1ba86438641689 WatchSource:0}: Error finding container bf785c10d5f86cd47afb4f81c65ef7a71ff46388ed76ca8d9d1ba86438641689: Status 404 returned error can't find the container with id bf785c10d5f86cd47afb4f81c65ef7a71ff46388ed76ca8d9d1ba86438641689 Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.605495 4933 generic.go:334] "Generic (PLEG): container finished" podID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerID="afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15" exitCode=0 Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.606050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" event={"ID":"4cc5d99c-4636-42d0-a7d3-6af1d371695c","Type":"ContainerDied","Data":"afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15"} Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.606078 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" event={"ID":"4cc5d99c-4636-42d0-a7d3-6af1d371695c","Type":"ContainerStarted","Data":"e656ba1d5be3675ceccc6b78d3a1bd287dcb194dc5446d5e05426ebc21495ff8"} Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.686293 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-647f597d98-xvnzx" event={"ID":"59d1316b-e770-4a63-bd2c-f9c51b6f3082","Type":"ContainerStarted","Data":"098ad90f3ee386fb3380cf335920d9277e2008438cf1a012f4e7cd1e1740ca47"} Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.686677 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-647f597d98-xvnzx" event={"ID":"59d1316b-e770-4a63-bd2c-f9c51b6f3082","Type":"ContainerStarted","Data":"bf785c10d5f86cd47afb4f81c65ef7a71ff46388ed76ca8d9d1ba86438641689"} Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.765089 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" event={"ID":"78795142-23f4-4bfd-ba25-479d6cc3c19f","Type":"ContainerStarted","Data":"a7bf612c4b720fac14d9a4497b39d574df002d4cf5123ebc3cf1ab1274190731"} Dec 02 16:15:26 crc kubenswrapper[4933]: I1202 16:15:26.770970 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56bd7dd77f-sxk4g" event={"ID":"04729c7a-7c1b-4138-832a-f6d0bf327720","Type":"ContainerStarted","Data":"7f8a55b83908c48fd2e2f92e8777040d81b7ef0b683286a1e6e1f37b2068d66a"} Dec 02 16:15:26 crc kubenswrapper[4933]: E1202 16:15:26.942366 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.786681 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" event={"ID":"4cc5d99c-4636-42d0-a7d3-6af1d371695c","Type":"ContainerStarted","Data":"e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1"} Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.787109 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.792621 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-647f597d98-xvnzx" event={"ID":"59d1316b-e770-4a63-bd2c-f9c51b6f3082","Type":"ContainerStarted","Data":"16e86c960ed5727d5a82c5c718ceb9b1831d1b6e070955b05fb1a8c642a714c5"} Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.792669 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.792726 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.817479 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" podStartSLOduration=3.817464011 podStartE2EDuration="3.817464011s" podCreationTimestamp="2025-12-02 16:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:27.816520975 +0000 UTC m=+1391.067747678" watchObservedRunningTime="2025-12-02 16:15:27.817464011 +0000 UTC m=+1391.068690714" Dec 02 16:15:27 crc kubenswrapper[4933]: I1202 16:15:27.849760 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-647f597d98-xvnzx" podStartSLOduration=3.849739798 podStartE2EDuration="3.849739798s" podCreationTimestamp="2025-12-02 16:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:27.846816218 +0000 UTC m=+1391.098042931" watchObservedRunningTime="2025-12-02 16:15:27.849739798 +0000 UTC m=+1391.100966501" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.428969 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56654f9db6-xk8pt"] Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.432517 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.456248 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.456288 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.480936 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56654f9db6-xk8pt"] Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527133 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-config-data\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527190 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-internal-tls-certs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec17902-d3a5-4961-88eb-65c3773747fa-logs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527594 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvhjz\" (UniqueName: \"kubernetes.io/projected/cec17902-d3a5-4961-88eb-65c3773747fa-kube-api-access-xvhjz\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527651 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-combined-ca-bundle\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527767 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-public-tls-certs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.527836 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-config-data-custom\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.632091 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec17902-d3a5-4961-88eb-65c3773747fa-logs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.632484 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvhjz\" (UniqueName: \"kubernetes.io/projected/cec17902-d3a5-4961-88eb-65c3773747fa-kube-api-access-xvhjz\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.632594 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-combined-ca-bundle\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.632733 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-public-tls-certs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.632863 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-config-data-custom\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.633019 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-config-data\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.633122 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-internal-tls-certs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.634454 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec17902-d3a5-4961-88eb-65c3773747fa-logs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.641586 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-config-data\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.641975 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-combined-ca-bundle\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.642542 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-internal-tls-certs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.644965 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-config-data-custom\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.652122 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvhjz\" (UniqueName: \"kubernetes.io/projected/cec17902-d3a5-4961-88eb-65c3773747fa-kube-api-access-xvhjz\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.652510 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec17902-d3a5-4961-88eb-65c3773747fa-public-tls-certs\") pod \"barbican-api-56654f9db6-xk8pt\" (UID: \"cec17902-d3a5-4961-88eb-65c3773747fa\") " pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.801400 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56bd7dd77f-sxk4g" event={"ID":"04729c7a-7c1b-4138-832a-f6d0bf327720","Type":"ContainerStarted","Data":"94f7246beb9dc2b8f7827705fff88e41efa6da7374f48e67f6b282df32071bcd"} Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.801449 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56bd7dd77f-sxk4g" event={"ID":"04729c7a-7c1b-4138-832a-f6d0bf327720","Type":"ContainerStarted","Data":"5433657ae55626e4ef0dafb2bacc2da5d976624ceeeb8b52c9bca1ec0fe7e7c7"} Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.805065 4933 generic.go:334] "Generic (PLEG): container finished" podID="c4239535-0d5c-4b17-a695-9f57efb4d381" containerID="dae24602cbed033884446a88e1c910eb45268850208f26904539043d4084c582" exitCode=0 Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.805146 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kzxbf" event={"ID":"c4239535-0d5c-4b17-a695-9f57efb4d381","Type":"ContainerDied","Data":"dae24602cbed033884446a88e1c910eb45268850208f26904539043d4084c582"} Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.816743 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:28 crc kubenswrapper[4933]: I1202 16:15:28.828002 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56bd7dd77f-sxk4g" podStartSLOduration=3.406030325 podStartE2EDuration="5.827982777s" podCreationTimestamp="2025-12-02 16:15:23 +0000 UTC" firstStartedPulling="2025-12-02 16:15:25.456095715 +0000 UTC m=+1388.707322418" lastFinishedPulling="2025-12-02 16:15:27.878048167 +0000 UTC m=+1391.129274870" observedRunningTime="2025-12-02 16:15:28.816356561 +0000 UTC m=+1392.067583264" watchObservedRunningTime="2025-12-02 16:15:28.827982777 +0000 UTC m=+1392.079209480" Dec 02 16:15:29 crc kubenswrapper[4933]: I1202 16:15:29.407787 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56654f9db6-xk8pt"] Dec 02 16:15:29 crc kubenswrapper[4933]: W1202 16:15:29.637693 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcec17902_d3a5_4961_88eb_65c3773747fa.slice/crio-8340898cea4bbc0cbad7fc0acf736aa516f8dc3913fe0032eca770a671b66ef6 WatchSource:0}: Error finding container 8340898cea4bbc0cbad7fc0acf736aa516f8dc3913fe0032eca770a671b66ef6: Status 404 returned error can't find the container with id 8340898cea4bbc0cbad7fc0acf736aa516f8dc3913fe0032eca770a671b66ef6 Dec 02 16:15:29 crc kubenswrapper[4933]: I1202 16:15:29.822947 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56654f9db6-xk8pt" event={"ID":"cec17902-d3a5-4961-88eb-65c3773747fa","Type":"ContainerStarted","Data":"8340898cea4bbc0cbad7fc0acf736aa516f8dc3913fe0032eca770a671b66ef6"} Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.332758 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kzxbf" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.478269 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-config-data\") pod \"c4239535-0d5c-4b17-a695-9f57efb4d381\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.478580 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-combined-ca-bundle\") pod \"c4239535-0d5c-4b17-a695-9f57efb4d381\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.478792 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqxrb\" (UniqueName: \"kubernetes.io/projected/c4239535-0d5c-4b17-a695-9f57efb4d381-kube-api-access-mqxrb\") pod \"c4239535-0d5c-4b17-a695-9f57efb4d381\" (UID: \"c4239535-0d5c-4b17-a695-9f57efb4d381\") " Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.484604 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4239535-0d5c-4b17-a695-9f57efb4d381-kube-api-access-mqxrb" (OuterVolumeSpecName: "kube-api-access-mqxrb") pod "c4239535-0d5c-4b17-a695-9f57efb4d381" (UID: "c4239535-0d5c-4b17-a695-9f57efb4d381"). InnerVolumeSpecName "kube-api-access-mqxrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.531568 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4239535-0d5c-4b17-a695-9f57efb4d381" (UID: "c4239535-0d5c-4b17-a695-9f57efb4d381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.580953 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.580995 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqxrb\" (UniqueName: \"kubernetes.io/projected/c4239535-0d5c-4b17-a695-9f57efb4d381-kube-api-access-mqxrb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.597161 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-config-data" (OuterVolumeSpecName: "config-data") pod "c4239535-0d5c-4b17-a695-9f57efb4d381" (UID: "c4239535-0d5c-4b17-a695-9f57efb4d381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.682910 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4239535-0d5c-4b17-a695-9f57efb4d381-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.835452 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-kzxbf" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.835451 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kzxbf" event={"ID":"c4239535-0d5c-4b17-a695-9f57efb4d381","Type":"ContainerDied","Data":"fa6ffc6935851f92d2d41ea2dfa32f36e7c6c6d7dda367e75428d91c958803c0"} Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.835910 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6ffc6935851f92d2d41ea2dfa32f36e7c6c6d7dda367e75428d91c958803c0" Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.838857 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56654f9db6-xk8pt" event={"ID":"cec17902-d3a5-4961-88eb-65c3773747fa","Type":"ContainerStarted","Data":"576d8959be9b625a429ab18164256a61fb9fd46babdd0f92458efa69a4a92456"} Dec 02 16:15:30 crc kubenswrapper[4933]: I1202 16:15:30.840469 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" event={"ID":"78795142-23f4-4bfd-ba25-479d6cc3c19f","Type":"ContainerStarted","Data":"f487441dbfa1ff19b56f80c3bb75b98e6bb106529a2733ad42d254b120a53c5e"} Dec 02 16:15:31 crc kubenswrapper[4933]: I1202 16:15:31.855590 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" event={"ID":"78795142-23f4-4bfd-ba25-479d6cc3c19f","Type":"ContainerStarted","Data":"f75648c32d12b4d2435980ef81288e581e93121b475744511226ff9b58179d4b"} Dec 02 16:15:31 crc kubenswrapper[4933]: I1202 16:15:31.858631 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56654f9db6-xk8pt" event={"ID":"cec17902-d3a5-4961-88eb-65c3773747fa","Type":"ContainerStarted","Data":"2081dda03ee776ef1a09f6509fb4009da8c093aca45416f32056a0441482171c"} Dec 02 16:15:31 crc kubenswrapper[4933]: I1202 16:15:31.859350 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:31 crc kubenswrapper[4933]: I1202 16:15:31.859380 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:31 crc kubenswrapper[4933]: I1202 16:15:31.886472 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64dd586798-jl4hw" podStartSLOduration=4.967050196 podStartE2EDuration="8.886452489s" podCreationTimestamp="2025-12-02 16:15:23 +0000 UTC" firstStartedPulling="2025-12-02 16:15:25.815146505 +0000 UTC m=+1389.066373208" lastFinishedPulling="2025-12-02 16:15:29.734548798 +0000 UTC m=+1392.985775501" observedRunningTime="2025-12-02 16:15:31.883055787 +0000 UTC m=+1395.134282530" watchObservedRunningTime="2025-12-02 16:15:31.886452489 +0000 UTC m=+1395.137679192" Dec 02 16:15:31 crc kubenswrapper[4933]: I1202 16:15:31.906443 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56654f9db6-xk8pt" podStartSLOduration=3.906423222 podStartE2EDuration="3.906423222s" podCreationTimestamp="2025-12-02 16:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:31.902150006 +0000 UTC m=+1395.153376709" watchObservedRunningTime="2025-12-02 16:15:31.906423222 +0000 UTC m=+1395.157649945" Dec 02 16:15:32 crc kubenswrapper[4933]: E1202 16:15:32.844429 4933 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:32 crc kubenswrapper[4933]: I1202 16:15:32.871928 4933 generic.go:334] "Generic (PLEG): container finished" podID="9c326307-73df-462e-98ec-0e4dc89fdd54" containerID="c0ba5a252954e9480867705618a22bf881b3aa04ad391282d038ca64caa0c0f4" exitCode=0 Dec 02 16:15:32 crc kubenswrapper[4933]: I1202 16:15:32.873287 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gtbcc" event={"ID":"9c326307-73df-462e-98ec-0e4dc89fdd54","Type":"ContainerDied","Data":"c0ba5a252954e9480867705618a22bf881b3aa04ad391282d038ca64caa0c0f4"} Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.687106 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.780095 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.809007 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c326307-73df-462e-98ec-0e4dc89fdd54-etc-machine-id\") pod \"9c326307-73df-462e-98ec-0e4dc89fdd54\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.809093 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-combined-ca-bundle\") pod \"9c326307-73df-462e-98ec-0e4dc89fdd54\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.809119 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-scripts\") pod \"9c326307-73df-462e-98ec-0e4dc89fdd54\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.809145 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-config-data\") pod \"9c326307-73df-462e-98ec-0e4dc89fdd54\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.809204 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htbmg\" (UniqueName: \"kubernetes.io/projected/9c326307-73df-462e-98ec-0e4dc89fdd54-kube-api-access-htbmg\") pod \"9c326307-73df-462e-98ec-0e4dc89fdd54\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.809301 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-db-sync-config-data\") pod \"9c326307-73df-462e-98ec-0e4dc89fdd54\" (UID: \"9c326307-73df-462e-98ec-0e4dc89fdd54\") " Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.812331 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9c326307-73df-462e-98ec-0e4dc89fdd54-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9c326307-73df-462e-98ec-0e4dc89fdd54" (UID: "9c326307-73df-462e-98ec-0e4dc89fdd54"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.832428 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-scripts" (OuterVolumeSpecName: "scripts") pod "9c326307-73df-462e-98ec-0e4dc89fdd54" (UID: "9c326307-73df-462e-98ec-0e4dc89fdd54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.832481 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9c326307-73df-462e-98ec-0e4dc89fdd54" (UID: "9c326307-73df-462e-98ec-0e4dc89fdd54"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.832630 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c326307-73df-462e-98ec-0e4dc89fdd54-kube-api-access-htbmg" (OuterVolumeSpecName: "kube-api-access-htbmg") pod "9c326307-73df-462e-98ec-0e4dc89fdd54" (UID: "9c326307-73df-462e-98ec-0e4dc89fdd54"). InnerVolumeSpecName "kube-api-access-htbmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.851801 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-rsdtj"] Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.852098 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="dnsmasq-dns" containerID="cri-o://65f079f4e0d724a170824f5c5fa2c7091b6e7cfc8b158d84a8b002b63c105f29" gracePeriod=10 Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.882067 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c326307-73df-462e-98ec-0e4dc89fdd54" (UID: "9c326307-73df-462e-98ec-0e4dc89fdd54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.928350 4933 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.928389 4933 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c326307-73df-462e-98ec-0e4dc89fdd54-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.928402 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.928416 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.928429 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htbmg\" (UniqueName: \"kubernetes.io/projected/9c326307-73df-462e-98ec-0e4dc89fdd54-kube-api-access-htbmg\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.946923 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-config-data" (OuterVolumeSpecName: "config-data") pod "9c326307-73df-462e-98ec-0e4dc89fdd54" (UID: "9c326307-73df-462e-98ec-0e4dc89fdd54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.954961 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gtbcc" event={"ID":"9c326307-73df-462e-98ec-0e4dc89fdd54","Type":"ContainerDied","Data":"313c2171fb9b1539094a70fde927f6b0bb7f25b38ec453ec852f2b26a4f221d9"} Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.955009 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313c2171fb9b1539094a70fde927f6b0bb7f25b38ec453ec852f2b26a4f221d9" Dec 02 16:15:34 crc kubenswrapper[4933]: I1202 16:15:34.955106 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gtbcc" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.040228 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c326307-73df-462e-98ec-0e4dc89fdd54-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.149086 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.149479 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.182258 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:35 crc kubenswrapper[4933]: E1202 16:15:35.182941 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c326307-73df-462e-98ec-0e4dc89fdd54" containerName="cinder-db-sync" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.182953 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c326307-73df-462e-98ec-0e4dc89fdd54" containerName="cinder-db-sync" Dec 02 16:15:35 crc kubenswrapper[4933]: E1202 16:15:35.182964 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4239535-0d5c-4b17-a695-9f57efb4d381" containerName="heat-db-sync" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.182970 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4239535-0d5c-4b17-a695-9f57efb4d381" containerName="heat-db-sync" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.183165 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4239535-0d5c-4b17-a695-9f57efb4d381" containerName="heat-db-sync" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.183181 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c326307-73df-462e-98ec-0e4dc89fdd54" containerName="cinder-db-sync" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.185479 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.192003 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.192593 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.192728 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.192845 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4k8nt" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.203509 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.238331 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.249058 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.249170 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.251044 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.252050 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.252093 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-scripts\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.252134 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.252187 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.252241 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6fe320-04c0-4184-a67d-50676c0717b5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 
16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.252274 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh25g\" (UniqueName: \"kubernetes.io/projected/8d6fe320-04c0-4184-a67d-50676c0717b5-kube-api-access-nh25g\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.260185 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zj8w6"] Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.262080 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.309357 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zj8w6"] Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.356012 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.356106 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.356157 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6fe320-04c0-4184-a67d-50676c0717b5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.356212 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh25g\" (UniqueName: \"kubernetes.io/projected/8d6fe320-04c0-4184-a67d-50676c0717b5-kube-api-access-nh25g\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.356416 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.356454 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-scripts\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.367855 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-scripts\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.369740 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.369787 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6fe320-04c0-4184-a67d-50676c0717b5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.374966 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.377187 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.384587 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.390312 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.394204 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh25g\" (UniqueName: \"kubernetes.io/projected/8d6fe320-04c0-4184-a67d-50676c0717b5-kube-api-access-nh25g\") pod \"cinder-scheduler-0\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.427261 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.437298 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.442613 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.449416 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.459507 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.459583 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76ll4\" (UniqueName: \"kubernetes.io/projected/317f6015-0b9d-4a80-b022-2b77224b0284-kube-api-access-76ll4\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.459645 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.459671 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.459709 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-config\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.459755 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.550628 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568096 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568349 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568471 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-scripts\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568554 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data-custom\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568622 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568697 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrcs\" (UniqueName: \"kubernetes.io/projected/275eb386-3fe4-4326-aca5-4658b83a8783-kube-api-access-gmrcs\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568811 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.568941 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76ll4\" (UniqueName: \"kubernetes.io/projected/317f6015-0b9d-4a80-b022-2b77224b0284-kube-api-access-76ll4\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569024 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/275eb386-3fe4-4326-aca5-4658b83a8783-etc-machine-id\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569120 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569195 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/275eb386-3fe4-4326-aca5-4658b83a8783-logs\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569267 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569366 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-config\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569453 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.569836 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.570880 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-config\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.571266 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.576268 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.595405 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76ll4\" (UniqueName: 
\"kubernetes.io/projected/317f6015-0b9d-4a80-b022-2b77224b0284-kube-api-access-76ll4\") pod \"dnsmasq-dns-6bb4fc677f-zj8w6\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.600443 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672351 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672410 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-scripts\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672439 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data-custom\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672454 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672483 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrcs\" (UniqueName: \"kubernetes.io/projected/275eb386-3fe4-4326-aca5-4658b83a8783-kube-api-access-gmrcs\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672571 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/275eb386-3fe4-4326-aca5-4658b83a8783-etc-machine-id\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.672610 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/275eb386-3fe4-4326-aca5-4658b83a8783-logs\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.673044 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/275eb386-3fe4-4326-aca5-4658b83a8783-etc-machine-id\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.673060 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/275eb386-3fe4-4326-aca5-4658b83a8783-logs\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " 
pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.683793 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-scripts\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.687686 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.687960 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.690264 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data-custom\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.701124 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrcs\" (UniqueName: \"kubernetes.io/projected/275eb386-3fe4-4326-aca5-4658b83a8783-kube-api-access-gmrcs\") pod \"cinder-api-0\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.785482 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.993034 4933 generic.go:334] "Generic (PLEG): container finished" podID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerID="65f079f4e0d724a170824f5c5fa2c7091b6e7cfc8b158d84a8b002b63c105f29" exitCode=0 Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.993118 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" event={"ID":"f7534185-21e1-4f0b-81e8-550e0c24b0b3","Type":"ContainerDied","Data":"65f079f4e0d724a170824f5c5fa2c7091b6e7cfc8b158d84a8b002b63c105f29"} Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.993646 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.993731 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.993745 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 16:15:35 crc kubenswrapper[4933]: I1202 16:15:35.993756 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 16:15:37 crc kubenswrapper[4933]: I1202 16:15:37.494474 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:37 crc kubenswrapper[4933]: I1202 16:15:37.537647 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:37 crc kubenswrapper[4933]: I1202 16:15:37.880198 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:38 crc kubenswrapper[4933]: I1202 16:15:38.537487 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.180181 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.278715 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-svc\") pod \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.279047 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-config\") pod \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.279097 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-swift-storage-0\") pod \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.279161 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-nb\") pod \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.279196 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-sb\") pod \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.279261 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnzvk\" (UniqueName: \"kubernetes.io/projected/f7534185-21e1-4f0b-81e8-550e0c24b0b3-kube-api-access-jnzvk\") pod \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\" (UID: \"f7534185-21e1-4f0b-81e8-550e0c24b0b3\") " Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.302076 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7534185-21e1-4f0b-81e8-550e0c24b0b3-kube-api-access-jnzvk" (OuterVolumeSpecName: "kube-api-access-jnzvk") pod "f7534185-21e1-4f0b-81e8-550e0c24b0b3" (UID: "f7534185-21e1-4f0b-81e8-550e0c24b0b3"). InnerVolumeSpecName "kube-api-access-jnzvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.381393 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnzvk\" (UniqueName: \"kubernetes.io/projected/f7534185-21e1-4f0b-81e8-550e0c24b0b3-kube-api-access-jnzvk\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.568246 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zj8w6"] Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.618753 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.662225 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7534185-21e1-4f0b-81e8-550e0c24b0b3" (UID: "f7534185-21e1-4f0b-81e8-550e0c24b0b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.664276 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7534185-21e1-4f0b-81e8-550e0c24b0b3" (UID: "f7534185-21e1-4f0b-81e8-550e0c24b0b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.664777 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-config" (OuterVolumeSpecName: "config") pod "f7534185-21e1-4f0b-81e8-550e0c24b0b3" (UID: "f7534185-21e1-4f0b-81e8-550e0c24b0b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.690678 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.690718 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.690729 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:39 crc kubenswrapper[4933]: E1202 16:15:39.717487 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.718243 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7534185-21e1-4f0b-81e8-550e0c24b0b3" (UID: "f7534185-21e1-4f0b-81e8-550e0c24b0b3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.752937 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7534185-21e1-4f0b-81e8-550e0c24b0b3" (UID: "f7534185-21e1-4f0b-81e8-550e0c24b0b3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.801361 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.801388 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7534185-21e1-4f0b-81e8-550e0c24b0b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:39 crc kubenswrapper[4933]: I1202 16:15:39.814929 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.113210 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerStarted","Data":"5a265ce9cfac5e2d718a8f4fd680bc99910422fc26fd8a3080f8dffb10282c0e"} Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.113375 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="ceilometer-notification-agent" containerID="cri-o://5be483c4efbbc0bdf004464f4b5ac3acc48e3c0bc23a17e87f1eb680e31449a9" gracePeriod=30 Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.113453 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.113840 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="proxy-httpd" containerID="cri-o://5a265ce9cfac5e2d718a8f4fd680bc99910422fc26fd8a3080f8dffb10282c0e" gracePeriod=30 Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.113888 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="sg-core" containerID="cri-o://231ac796b1f2247613553fc4a1ad4f8c9d33f2ebe0b8b3755239d3b29f0a8d3f" gracePeriod=30 Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.131592 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" event={"ID":"317f6015-0b9d-4a80-b022-2b77224b0284","Type":"ContainerStarted","Data":"e94949f39f78da1a801f22416f0a4d14c105ecdf40dd2a8fa01eb8d574b8d34a"} Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.131638 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" event={"ID":"317f6015-0b9d-4a80-b022-2b77224b0284","Type":"ContainerStarted","Data":"29227f17a8df73d6a7cce763652a4d6bbf51496172bbb6023be119e084ad73c3"} Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.152306 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" 
event={"ID":"f7534185-21e1-4f0b-81e8-550e0c24b0b3","Type":"ContainerDied","Data":"72ad20b5587dcb8d367b7637a2f4cc3d4b97f70f1acf8220f5ab7a8d2b5d3c79"} Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.152369 4933 scope.go:117] "RemoveContainer" containerID="65f079f4e0d724a170824f5c5fa2c7091b6e7cfc8b158d84a8b002b63c105f29" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.154104 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.168148 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"275eb386-3fe4-4326-aca5-4658b83a8783","Type":"ContainerStarted","Data":"d58f54cdc0d4e67c5658fd8c6c1aa5e273ddb8b1870b6b28885eefc9b98fd4e2"} Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.174633 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8d6fe320-04c0-4184-a67d-50676c0717b5","Type":"ContainerStarted","Data":"93d9fea5ae34e09e6b85ba49db3f9d1ba3e987bdbf8ae6717ec5afcfe0693d28"} Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.259573 4933 scope.go:117] "RemoveContainer" containerID="266a0e409caef7d1f1179bd90f18a2c60de994cd64c89905025cfa118c98920b" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.270400 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-rsdtj"] Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.282207 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-rsdtj"] Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.759858 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.760211 4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.760808 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.797055 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.797316 4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 16:15:40 crc kubenswrapper[4933]: I1202 16:15:40.872744 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.083913 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" path="/var/lib/kubelet/pods/f7534185-21e1-4f0b-81e8-550e0c24b0b3/volumes" Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.209593 4933 generic.go:334] "Generic (PLEG): container finished" podID="317f6015-0b9d-4a80-b022-2b77224b0284" containerID="e94949f39f78da1a801f22416f0a4d14c105ecdf40dd2a8fa01eb8d574b8d34a" exitCode=0 Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.209680 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" event={"ID":"317f6015-0b9d-4a80-b022-2b77224b0284","Type":"ContainerDied","Data":"e94949f39f78da1a801f22416f0a4d14c105ecdf40dd2a8fa01eb8d574b8d34a"} Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.259748 4933 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"275eb386-3fe4-4326-aca5-4658b83a8783","Type":"ContainerStarted","Data":"6fe16daec987538433d6adf8c740cdafa347db9e5e864e13c9d2b64a817e3ca5"} Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.268836 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerID="5a265ce9cfac5e2d718a8f4fd680bc99910422fc26fd8a3080f8dffb10282c0e" exitCode=0 Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.268880 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerID="231ac796b1f2247613553fc4a1ad4f8c9d33f2ebe0b8b3755239d3b29f0a8d3f" exitCode=2 Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.268869 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerDied","Data":"5a265ce9cfac5e2d718a8f4fd680bc99910422fc26fd8a3080f8dffb10282c0e"} Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.268917 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerDied","Data":"231ac796b1f2247613553fc4a1ad4f8c9d33f2ebe0b8b3755239d3b29f0a8d3f"} Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.461203 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.508954 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56654f9db6-xk8pt" Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.571344 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-647f597d98-xvnzx"] Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.571856 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-647f597d98-xvnzx" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api-log" containerID="cri-o://098ad90f3ee386fb3380cf335920d9277e2008438cf1a012f4e7cd1e1740ca47" gracePeriod=30 Dec 02 16:15:41 crc kubenswrapper[4933]: I1202 16:15:41.571974 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-647f597d98-xvnzx" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api" containerID="cri-o://16e86c960ed5727d5a82c5c718ceb9b1831d1b6e070955b05fb1a8c642a714c5" gracePeriod=30 Dec 02 16:15:42 crc kubenswrapper[4933]: E1202 16:15:42.217519 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.296909 4933 generic.go:334] "Generic (PLEG): container finished" podID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerID="098ad90f3ee386fb3380cf335920d9277e2008438cf1a012f4e7cd1e1740ca47" exitCode=143 Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.297009 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-647f597d98-xvnzx" 
event={"ID":"59d1316b-e770-4a63-bd2c-f9c51b6f3082","Type":"ContainerDied","Data":"098ad90f3ee386fb3380cf335920d9277e2008438cf1a012f4e7cd1e1740ca47"} Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.299244 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" event={"ID":"317f6015-0b9d-4a80-b022-2b77224b0284","Type":"ContainerStarted","Data":"99001867c6db24be7d002dbc692073a058a7ce942910f622408b5ba18f8a730e"} Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.300396 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.327189 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"275eb386-3fe4-4326-aca5-4658b83a8783","Type":"ContainerStarted","Data":"4a0971639cd14f1cfd98e67c6cee918a1300afbfb85fa40767af6f4febb88f05"} Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.327427 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api-log" containerID="cri-o://6fe16daec987538433d6adf8c740cdafa347db9e5e864e13c9d2b64a817e3ca5" gracePeriod=30 Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.327713 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.327762 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api" containerID="cri-o://4a0971639cd14f1cfd98e67c6cee918a1300afbfb85fa40767af6f4febb88f05" gracePeriod=30 Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.327990 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" podStartSLOduration=7.32797036 podStartE2EDuration="7.32797036s" podCreationTimestamp="2025-12-02 16:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:42.325315118 +0000 UTC m=+1405.576541821" watchObservedRunningTime="2025-12-02 16:15:42.32797036 +0000 UTC m=+1405.579197063" Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.341186 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8d6fe320-04c0-4184-a67d-50676c0717b5","Type":"ContainerStarted","Data":"ef0193529a29cc40187469d2c484931607336e42925a30872414d9f579bc3945"} Dec 02 16:15:42 crc kubenswrapper[4933]: I1202 16:15:42.369316 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.369297483 podStartE2EDuration="7.369297483s" podCreationTimestamp="2025-12-02 16:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:42.353197826 +0000 UTC m=+1405.604424529" watchObservedRunningTime="2025-12-02 16:15:42.369297483 +0000 UTC m=+1405.620524186" Dec 02 16:15:42 crc kubenswrapper[4933]: E1202 16:15:42.901617 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.352648 4933 generic.go:334] "Generic (PLEG): container finished" podID="275eb386-3fe4-4326-aca5-4658b83a8783" containerID="4a0971639cd14f1cfd98e67c6cee918a1300afbfb85fa40767af6f4febb88f05" exitCode=0 Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.352977 4933 generic.go:334] "Generic (PLEG): container finished" podID="275eb386-3fe4-4326-aca5-4658b83a8783" containerID="6fe16daec987538433d6adf8c740cdafa347db9e5e864e13c9d2b64a817e3ca5" exitCode=143 Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.352719 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"275eb386-3fe4-4326-aca5-4658b83a8783","Type":"ContainerDied","Data":"4a0971639cd14f1cfd98e67c6cee918a1300afbfb85fa40767af6f4febb88f05"} Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.353022 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"275eb386-3fe4-4326-aca5-4658b83a8783","Type":"ContainerDied","Data":"6fe16daec987538433d6adf8c740cdafa347db9e5e864e13c9d2b64a817e3ca5"} Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.353041 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"275eb386-3fe4-4326-aca5-4658b83a8783","Type":"ContainerDied","Data":"d58f54cdc0d4e67c5658fd8c6c1aa5e273ddb8b1870b6b28885eefc9b98fd4e2"} Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.353054 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58f54cdc0d4e67c5658fd8c6c1aa5e273ddb8b1870b6b28885eefc9b98fd4e2" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.355291 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8d6fe320-04c0-4184-a67d-50676c0717b5","Type":"ContainerStarted","Data":"e36013856ecff5111ea7596fef6b0a38bd1cb5b093d5814e8b206b60a88f9b4a"} Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.386849 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.394257152 podStartE2EDuration="8.386813331s" podCreationTimestamp="2025-12-02 16:15:35 +0000 UTC" firstStartedPulling="2025-12-02 16:15:39.631069356 +0000 UTC m=+1402.882296059" lastFinishedPulling="2025-12-02 16:15:40.623625535 +0000 UTC m=+1403.874852238" observedRunningTime="2025-12-02 16:15:43.375157084 +0000 UTC m=+1406.626383787" watchObservedRunningTime="2025-12-02 16:15:43.386813331 +0000 UTC m=+1406.638040034" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.435292 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.602367 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-scripts\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.602491 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/275eb386-3fe4-4326-aca5-4658b83a8783-logs\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.602740 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-combined-ca-bundle\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.602865 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/275eb386-3fe4-4326-aca5-4658b83a8783-etc-machine-id\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.602894 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmrcs\" (UniqueName: \"kubernetes.io/projected/275eb386-3fe4-4326-aca5-4658b83a8783-kube-api-access-gmrcs\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.602975 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.603023 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data-custom\") pod \"275eb386-3fe4-4326-aca5-4658b83a8783\" (UID: \"275eb386-3fe4-4326-aca5-4658b83a8783\") " Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.603476 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/275eb386-3fe4-4326-aca5-4658b83a8783-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.604236 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/275eb386-3fe4-4326-aca5-4658b83a8783-logs" (OuterVolumeSpecName: "logs") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.616013 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.616681 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-scripts" (OuterVolumeSpecName: "scripts") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.620943 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275eb386-3fe4-4326-aca5-4658b83a8783-kube-api-access-gmrcs" (OuterVolumeSpecName: "kube-api-access-gmrcs") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "kube-api-access-gmrcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.650620 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-rsdtj" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: i/o timeout" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.688564 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.702902 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data" (OuterVolumeSpecName: "config-data") pod "275eb386-3fe4-4326-aca5-4658b83a8783" (UID: "275eb386-3fe4-4326-aca5-4658b83a8783"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705210 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705248 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/275eb386-3fe4-4326-aca5-4658b83a8783-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705259 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705270 4933 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/275eb386-3fe4-4326-aca5-4658b83a8783-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705279 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmrcs\" (UniqueName: \"kubernetes.io/projected/275eb386-3fe4-4326-aca5-4658b83a8783-kube-api-access-gmrcs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705287 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:43 crc kubenswrapper[4933]: I1202 16:15:43.705295 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/275eb386-3fe4-4326-aca5-4658b83a8783-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.021712 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d5bbf46cc-92zkh" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.096866 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-858c7d46f6-k58rb"] Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.097181 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-858c7d46f6-k58rb" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-httpd" containerID="cri-o://44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978" gracePeriod=30 Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.098083 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-858c7d46f6-k58rb" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-api" containerID="cri-o://e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187" gracePeriod=30 Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.122183 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.179172 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64bbf7fc9b-xn4tj" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.370254 4933 generic.go:334] "Generic (PLEG): container finished" podID="d981749f-83d5-4096-869d-617b2a6f1ce7" 
containerID="44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978" exitCode=0 Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.370448 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-858c7d46f6-k58rb" event={"ID":"d981749f-83d5-4096-869d-617b2a6f1ce7","Type":"ContainerDied","Data":"44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978"} Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.371080 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.441762 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.455666 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.471222 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:44 crc kubenswrapper[4933]: E1202 16:15:44.471732 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="init" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.471745 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="init" Dec 02 16:15:44 crc kubenswrapper[4933]: E1202 16:15:44.471762 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.471768 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api" Dec 02 16:15:44 crc kubenswrapper[4933]: E1202 16:15:44.471799 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api-log" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.471806 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api-log" Dec 02 16:15:44 crc kubenswrapper[4933]: E1202 16:15:44.471844 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="dnsmasq-dns" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.471851 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="dnsmasq-dns" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.472042 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api-log" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.472064 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" containerName="cinder-api" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.472072 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7534185-21e1-4f0b-81e8-550e0c24b0b3" containerName="dnsmasq-dns" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.473224 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.476692 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.477124 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.484509 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.487718 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.633981 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634058 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634130 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-scripts\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634149 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634164 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-config-data\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634198 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-logs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634231 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8vp\" (UniqueName: \"kubernetes.io/projected/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-kube-api-access-gs8vp\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634258 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.634294 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.736350 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.736519 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.736718 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.736935 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-scripts\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.736975 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.736994 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-config-data\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.737069 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-logs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.737128 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8vp\" (UniqueName: \"kubernetes.io/projected/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-kube-api-access-gs8vp\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.737163 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.737354 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.737405 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-logs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.741873 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.744070 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-config-data\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.745209 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.745630 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-scripts\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.746256 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.755580 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.756323 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8vp\" (UniqueName: \"kubernetes.io/projected/9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02-kube-api-access-gs8vp\") pod \"cinder-api-0\" (UID: \"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02\") " pod="openstack/cinder-api-0" Dec 02 16:15:44 crc kubenswrapper[4933]: I1202 16:15:44.792691 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.098253 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-647f597d98-xvnzx" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.195:9311/healthcheck\": dial tcp 10.217.0.195:9311: connect: connection refused" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.098284 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-647f597d98-xvnzx" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.195:9311/healthcheck\": dial tcp 10.217.0.195:9311: connect: connection refused" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.125302 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275eb386-3fe4-4326-aca5-4658b83a8783" path="/var/lib/kubelet/pods/275eb386-3fe4-4326-aca5-4658b83a8783/volumes" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.450567 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerID="5be483c4efbbc0bdf004464f4b5ac3acc48e3c0bc23a17e87f1eb680e31449a9" exitCode=0 Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.450878 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerDied","Data":"5be483c4efbbc0bdf004464f4b5ac3acc48e3c0bc23a17e87f1eb680e31449a9"} Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.490339 4933 generic.go:334] "Generic (PLEG): container finished" podID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerID="16e86c960ed5727d5a82c5c718ceb9b1831d1b6e070955b05fb1a8c642a714c5" exitCode=0 Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.490381 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-647f597d98-xvnzx" event={"ID":"59d1316b-e770-4a63-bd2c-f9c51b6f3082","Type":"ContainerDied","Data":"16e86c960ed5727d5a82c5c718ceb9b1831d1b6e070955b05fb1a8c642a714c5"} Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.520437 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 16:15:45 crc kubenswrapper[4933]: W1202 16:15:45.557984 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e9fc2a4_bdc8_4f3f_9ef1_b9e9a141cd02.slice/crio-11c3cfb45c53d85c31c452bf78d003e58710bc7425b98f6532dec049f8a59073 WatchSource:0}: Error finding container 11c3cfb45c53d85c31c452bf78d003e58710bc7425b98f6532dec049f8a59073: Status 404 returned error can't find the container with id 11c3cfb45c53d85c31c452bf78d003e58710bc7425b98f6532dec049f8a59073 Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.558178 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.837788 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.991756 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data-custom\") pod \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.992099 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2qz5\" (UniqueName: \"kubernetes.io/projected/59d1316b-e770-4a63-bd2c-f9c51b6f3082-kube-api-access-b2qz5\") pod \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.992127 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data\") pod \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.992241 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-combined-ca-bundle\") pod \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.992265 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d1316b-e770-4a63-bd2c-f9c51b6f3082-logs\") pod \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\" (UID: \"59d1316b-e770-4a63-bd2c-f9c51b6f3082\") " Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.993476 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59d1316b-e770-4a63-bd2c-f9c51b6f3082-logs" (OuterVolumeSpecName: "logs") pod "59d1316b-e770-4a63-bd2c-f9c51b6f3082" (UID: "59d1316b-e770-4a63-bd2c-f9c51b6f3082"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:45 crc kubenswrapper[4933]: I1202 16:15:45.998388 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d1316b-e770-4a63-bd2c-f9c51b6f3082-kube-api-access-b2qz5" (OuterVolumeSpecName: "kube-api-access-b2qz5") pod "59d1316b-e770-4a63-bd2c-f9c51b6f3082" (UID: "59d1316b-e770-4a63-bd2c-f9c51b6f3082"). InnerVolumeSpecName "kube-api-access-b2qz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.001571 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59d1316b-e770-4a63-bd2c-f9c51b6f3082" (UID: "59d1316b-e770-4a63-bd2c-f9c51b6f3082"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.070595 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59d1316b-e770-4a63-bd2c-f9c51b6f3082" (UID: "59d1316b-e770-4a63-bd2c-f9c51b6f3082"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.096355 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.096393 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2qz5\" (UniqueName: \"kubernetes.io/projected/59d1316b-e770-4a63-bd2c-f9c51b6f3082-kube-api-access-b2qz5\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.096405 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.096414 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d1316b-e770-4a63-bd2c-f9c51b6f3082-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.121097 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data" (OuterVolumeSpecName: "config-data") pod "59d1316b-e770-4a63-bd2c-f9c51b6f3082" (UID: "59d1316b-e770-4a63-bd2c-f9c51b6f3082"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.165212 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.207716 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d1316b-e770-4a63-bd2c-f9c51b6f3082-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.309342 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-config-data\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.309846 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-sg-core-conf-yaml\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.309876 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-run-httpd\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.309905 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-log-httpd\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.309940 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55s9v\" 
(UniqueName: \"kubernetes.io/projected/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-kube-api-access-55s9v\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.309967 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-combined-ca-bundle\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.310106 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-scripts\") pod \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\" (UID: \"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7\") " Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.310230 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.310478 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.311148 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.311165 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.319989 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-scripts" (OuterVolumeSpecName: "scripts") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.320000 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-kube-api-access-55s9v" (OuterVolumeSpecName: "kube-api-access-55s9v") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "kube-api-access-55s9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.369132 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.412873 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.412904 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55s9v\" (UniqueName: \"kubernetes.io/projected/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-kube-api-access-55s9v\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.412916 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.422677 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.461105 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-config-data" (OuterVolumeSpecName: "config-data") pod "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" (UID: "6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.519381 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.519408 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.541010 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-647f597d98-xvnzx" event={"ID":"59d1316b-e770-4a63-bd2c-f9c51b6f3082","Type":"ContainerDied","Data":"bf785c10d5f86cd47afb4f81c65ef7a71ff46388ed76ca8d9d1ba86438641689"} Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.541065 4933 scope.go:117] "RemoveContainer" containerID="16e86c960ed5727d5a82c5c718ceb9b1831d1b6e070955b05fb1a8c642a714c5" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.541129 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-647f597d98-xvnzx" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.543542 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02","Type":"ContainerStarted","Data":"d10f6fdde660f4fa11a8186a5a95976e728d253b435f932ef18fa0db1c61975e"} Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.543683 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02","Type":"ContainerStarted","Data":"11c3cfb45c53d85c31c452bf78d003e58710bc7425b98f6532dec049f8a59073"} Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.546532 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7","Type":"ContainerDied","Data":"550c21306ee7cd482a5b23cad1a624c6582a6fa96e02efd4d938b89c9799676b"} Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.546671 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.566123 4933 scope.go:117] "RemoveContainer" containerID="098ad90f3ee386fb3380cf335920d9277e2008438cf1a012f4e7cd1e1740ca47" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.618421 4933 scope.go:117] "RemoveContainer" containerID="5a265ce9cfac5e2d718a8f4fd680bc99910422fc26fd8a3080f8dffb10282c0e" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.642125 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.664497 4933 scope.go:117] "RemoveContainer" containerID="231ac796b1f2247613553fc4a1ad4f8c9d33f2ebe0b8b3755239d3b29f0a8d3f" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.666894 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.687924 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-647f597d98-xvnzx"] Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.701340 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-647f597d98-xvnzx"] Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.712077 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:15:46 crc kubenswrapper[4933]: E1202 16:15:46.712607 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="proxy-httpd" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.712622 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="proxy-httpd" Dec 02 16:15:46 crc kubenswrapper[4933]: E1202 16:15:46.712655 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api-log" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.712662 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api-log" Dec 02 16:15:46 crc kubenswrapper[4933]: E1202 16:15:46.712875 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="sg-core" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.712887 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="sg-core" Dec 02 16:15:46 crc kubenswrapper[4933]: E1202 16:15:46.712905 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="ceilometer-notification-agent" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.712911 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="ceilometer-notification-agent" Dec 02 16:15:46 crc kubenswrapper[4933]: E1202 16:15:46.712930 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.712936 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.713196 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api-log" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.713235 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" containerName="barbican-api" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.713266 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="ceilometer-notification-agent" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.713280 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="sg-core" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.713295 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" containerName="proxy-httpd" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.715253 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.723437 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.723620 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.724910 4933 scope.go:117] "RemoveContainer" containerID="5be483c4efbbc0bdf004464f4b5ac3acc48e3c0bc23a17e87f1eb680e31449a9" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.742141 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.832834 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-log-httpd\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.833109 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.833151 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.833179 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-run-httpd\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.833237 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-scripts\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.833260 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tpds\" (UniqueName: \"kubernetes.io/projected/41bc6554-0594-4a0b-9b1c-063f09af19ef-kube-api-access-2tpds\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.833337 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-config-data\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935076 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-scripts\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935144 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tpds\" (UniqueName: \"kubernetes.io/projected/41bc6554-0594-4a0b-9b1c-063f09af19ef-kube-api-access-2tpds\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935191 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-config-data\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935295 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-log-httpd\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935315 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935350 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.935373 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-run-httpd\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.936049 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-run-httpd\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.936194 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-log-httpd\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.940431 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.949417 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-scripts\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.950660 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.954688 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-config-data\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:46 crc kubenswrapper[4933]: I1202 16:15:46.955103 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tpds\" (UniqueName: \"kubernetes.io/projected/41bc6554-0594-4a0b-9b1c-063f09af19ef-kube-api-access-2tpds\") pod \"ceilometer-0\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " pod="openstack/ceilometer-0" Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.078139 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d1316b-e770-4a63-bd2c-f9c51b6f3082" path="/var/lib/kubelet/pods/59d1316b-e770-4a63-bd2c-f9c51b6f3082/volumes" Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.078865 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7" path="/var/lib/kubelet/pods/6d02b9de-9f5e-44fc-9e29-6a01b47c6ad7/volumes" Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.153293 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.559280 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02","Type":"ContainerStarted","Data":"637aec9e37c021706630db4dbd05e2649c87a901044d72cbfa18e46cd3aeb907"} Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.559529 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.581936 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.581913848 podStartE2EDuration="3.581913848s" podCreationTimestamp="2025-12-02 16:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:47.578779613 +0000 UTC m=+1410.830006316" watchObservedRunningTime="2025-12-02 16:15:47.581913848 +0000 UTC m=+1410.833140551" Dec 02 16:15:47 crc kubenswrapper[4933]: I1202 16:15:47.649816 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:15:48 crc kubenswrapper[4933]: E1202 16:15:48.283246 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:48 crc kubenswrapper[4933]: E1202 16:15:48.283625 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:48 crc kubenswrapper[4933]: I1202 16:15:48.578973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerStarted","Data":"031a7ad11bf9c6a8a8c6dc1425d10b1dc132c180d71a86bb43b8c3bb2cdc9116"} Dec 02 16:15:48 crc kubenswrapper[4933]: I1202 16:15:48.580109 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerStarted","Data":"56d6855ea9721351e5a8af06f6e077fb01313db7f3ea02a3098e5e44b9120591"} Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.011113 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5445f4c57f-62kcb" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.581777 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.589396 4933 generic.go:334] "Generic (PLEG): container finished" podID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerID="e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187" exitCode=0 Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.589417 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-858c7d46f6-k58rb" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.589472 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-858c7d46f6-k58rb" event={"ID":"d981749f-83d5-4096-869d-617b2a6f1ce7","Type":"ContainerDied","Data":"e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187"} Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.589503 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-858c7d46f6-k58rb" event={"ID":"d981749f-83d5-4096-869d-617b2a6f1ce7","Type":"ContainerDied","Data":"4a1275fc69bb461d08284214c94e1baf3bc0df6abd4f78aa5c3b5cf816985c4e"} Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.589557 4933 scope.go:117] "RemoveContainer" containerID="44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.594349 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerStarted","Data":"c6240e38b31d693201a8016a4efb4d03d3c6b66e5de3e1fc3934db6e3b139897"} Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.642724 4933 scope.go:117] "RemoveContainer" containerID="e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.663345 4933 scope.go:117] "RemoveContainer" containerID="44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978" Dec 02 16:15:49 crc kubenswrapper[4933]: E1202 16:15:49.663706 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978\": container with ID starting with 44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978 not found: ID does not exist" containerID="44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.663738 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978"} err="failed to get container status \"44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978\": rpc error: code = NotFound desc = could not find container \"44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978\": container with ID starting with 44c621895de73fe9045cc23d2f328372058a1a20e57f0bf15e8ae1507b0ef978 not found: ID does not exist" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.663762 4933 scope.go:117] "RemoveContainer" containerID="e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187" Dec 02 16:15:49 crc kubenswrapper[4933]: E1202 16:15:49.664012 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187\": container with ID starting with e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187 not found: ID does not exist" containerID="e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.664035 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187"} err="failed to get container status 
\"e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187\": rpc error: code = NotFound desc = could not find container \"e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187\": container with ID starting with e856d80dfe6cd8e2c018c8bc68cc8e2b7ce3692ad51e95bf05f1b5748a8e5187 not found: ID does not exist" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.741556 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-combined-ca-bundle\") pod \"d981749f-83d5-4096-869d-617b2a6f1ce7\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.741919 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwb5s\" (UniqueName: \"kubernetes.io/projected/d981749f-83d5-4096-869d-617b2a6f1ce7-kube-api-access-wwb5s\") pod \"d981749f-83d5-4096-869d-617b2a6f1ce7\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.742181 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-config\") pod \"d981749f-83d5-4096-869d-617b2a6f1ce7\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.742445 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-ovndb-tls-certs\") pod \"d981749f-83d5-4096-869d-617b2a6f1ce7\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.742865 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-httpd-config\") pod \"d981749f-83d5-4096-869d-617b2a6f1ce7\" (UID: \"d981749f-83d5-4096-869d-617b2a6f1ce7\") " Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.747503 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d981749f-83d5-4096-869d-617b2a6f1ce7-kube-api-access-wwb5s" (OuterVolumeSpecName: "kube-api-access-wwb5s") pod "d981749f-83d5-4096-869d-617b2a6f1ce7" (UID: "d981749f-83d5-4096-869d-617b2a6f1ce7"). InnerVolumeSpecName "kube-api-access-wwb5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.748191 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d981749f-83d5-4096-869d-617b2a6f1ce7" (UID: "d981749f-83d5-4096-869d-617b2a6f1ce7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.809706 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-config" (OuterVolumeSpecName: "config") pod "d981749f-83d5-4096-869d-617b2a6f1ce7" (UID: "d981749f-83d5-4096-869d-617b2a6f1ce7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.821514 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d981749f-83d5-4096-869d-617b2a6f1ce7" (UID: "d981749f-83d5-4096-869d-617b2a6f1ce7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.844595 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d981749f-83d5-4096-869d-617b2a6f1ce7" (UID: "d981749f-83d5-4096-869d-617b2a6f1ce7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.847151 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.847198 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwb5s\" (UniqueName: \"kubernetes.io/projected/d981749f-83d5-4096-869d-617b2a6f1ce7-kube-api-access-wwb5s\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.847222 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.847233 4933 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.847244 4933 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d981749f-83d5-4096-869d-617b2a6f1ce7-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.942456 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-858c7d46f6-k58rb"] Dec 02 16:15:49 crc kubenswrapper[4933]: I1202 16:15:49.951326 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-858c7d46f6-k58rb"] Dec 02 16:15:50 crc kubenswrapper[4933]: I1202 16:15:50.603032 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:15:50 crc kubenswrapper[4933]: I1202 16:15:50.691973 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kzxj4"] Dec 02 16:15:50 crc kubenswrapper[4933]: I1202 16:15:50.692487 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerName="dnsmasq-dns" containerID="cri-o://e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1" gracePeriod=10 Dec 02 16:15:50 crc kubenswrapper[4933]: I1202 16:15:50.874008 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 16:15:50 crc kubenswrapper[4933]: I1202 16:15:50.969299 4933 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.076900 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" path="/var/lib/kubelet/pods/d981749f-83d5-4096-869d-617b2a6f1ce7/volumes" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.454031 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.593677 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-sb\") pod \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.593735 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmqc6\" (UniqueName: \"kubernetes.io/projected/4cc5d99c-4636-42d0-a7d3-6af1d371695c-kube-api-access-cmqc6\") pod \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.593757 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-nb\") pod \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.593810 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-swift-storage-0\") pod \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.593873 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-config\") pod \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.593915 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-svc\") pod \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\" (UID: \"4cc5d99c-4636-42d0-a7d3-6af1d371695c\") " Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.598307 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc5d99c-4636-42d0-a7d3-6af1d371695c-kube-api-access-cmqc6" (OuterVolumeSpecName: "kube-api-access-cmqc6") pod "4cc5d99c-4636-42d0-a7d3-6af1d371695c" (UID: "4cc5d99c-4636-42d0-a7d3-6af1d371695c"). InnerVolumeSpecName "kube-api-access-cmqc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.626259 4933 generic.go:334] "Generic (PLEG): container finished" podID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerID="e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1" exitCode=0 Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.628052 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="cinder-scheduler" containerID="cri-o://ef0193529a29cc40187469d2c484931607336e42925a30872414d9f579bc3945" gracePeriod=30 Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.626522 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.628583 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="probe" containerID="cri-o://e36013856ecff5111ea7596fef6b0a38bd1cb5b093d5814e8b206b60a88f9b4a" gracePeriod=30 Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.626450 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" event={"ID":"4cc5d99c-4636-42d0-a7d3-6af1d371695c","Type":"ContainerDied","Data":"e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1"} Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.628751 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kzxj4" event={"ID":"4cc5d99c-4636-42d0-a7d3-6af1d371695c","Type":"ContainerDied","Data":"e656ba1d5be3675ceccc6b78d3a1bd287dcb194dc5446d5e05426ebc21495ff8"} Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.628773 4933 scope.go:117] "RemoveContainer" containerID="e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.659269 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-config" (OuterVolumeSpecName: "config") pod "4cc5d99c-4636-42d0-a7d3-6af1d371695c" (UID: "4cc5d99c-4636-42d0-a7d3-6af1d371695c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.668038 4933 scope.go:117] "RemoveContainer" containerID="afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.669493 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cc5d99c-4636-42d0-a7d3-6af1d371695c" (UID: "4cc5d99c-4636-42d0-a7d3-6af1d371695c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.696632 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cc5d99c-4636-42d0-a7d3-6af1d371695c" (UID: "4cc5d99c-4636-42d0-a7d3-6af1d371695c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.699402 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.699440 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmqc6\" (UniqueName: \"kubernetes.io/projected/4cc5d99c-4636-42d0-a7d3-6af1d371695c-kube-api-access-cmqc6\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.699454 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.699465 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.709502 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4cc5d99c-4636-42d0-a7d3-6af1d371695c" (UID: "4cc5d99c-4636-42d0-a7d3-6af1d371695c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.712232 4933 scope.go:117] "RemoveContainer" containerID="e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1" Dec 02 16:15:51 crc kubenswrapper[4933]: E1202 16:15:51.713425 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1\": container with ID starting with e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1 not found: ID does not exist" containerID="e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.713474 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1"} err="failed to get container status \"e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1\": rpc error: code = NotFound desc = could not find container \"e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1\": container with ID starting with e74f1ee919c41a51f40effceb07124f910423bd80fefa16105d908176f1dd0c1 not found: ID does not exist" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.713509 4933 scope.go:117] "RemoveContainer" containerID="afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15" Dec 02 16:15:51 crc kubenswrapper[4933]: E1202 16:15:51.713908 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15\": container with ID starting with afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15 not found: ID does not exist" containerID="afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.713940 4933 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15"} err="failed to get container status \"afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15\": rpc error: code = NotFound desc = could not find container \"afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15\": container with ID starting with afa60d60efcfc158a95c12d22fd25cc73f3c38cfa614deb3f004db461bcb8f15 not found: ID does not exist" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.726598 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cc5d99c-4636-42d0-a7d3-6af1d371695c" (UID: "4cc5d99c-4636-42d0-a7d3-6af1d371695c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.801103 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.801132 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cc5d99c-4636-42d0-a7d3-6af1d371695c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.963939 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kzxj4"] Dec 02 16:15:51 crc kubenswrapper[4933]: I1202 16:15:51.981102 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kzxj4"] Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180052 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 16:15:52 crc kubenswrapper[4933]: E1202 16:15:52.180560 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-api" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180579 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-api" Dec 02 16:15:52 crc kubenswrapper[4933]: E1202 16:15:52.180598 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-httpd" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180604 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-httpd" Dec 02 16:15:52 crc kubenswrapper[4933]: E1202 16:15:52.180616 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerName="init" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180623 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerName="init" Dec 02 16:15:52 crc kubenswrapper[4933]: E1202 16:15:52.180646 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerName="dnsmasq-dns" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180651 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerName="dnsmasq-dns" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 
16:15:52.180887 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-httpd" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180926 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d981749f-83d5-4096-869d-617b2a6f1ce7" containerName="neutron-api" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.180946 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" containerName="dnsmasq-dns" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.181725 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.185431 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.185484 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.185520 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8hl7c" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.193493 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.327987 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f673f811-c1d1-4e11-94d8-4932e9761bbf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.328421 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktnfv\" (UniqueName: \"kubernetes.io/projected/f673f811-c1d1-4e11-94d8-4932e9761bbf-kube-api-access-ktnfv\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.328450 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f673f811-c1d1-4e11-94d8-4932e9761bbf-openstack-config\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.328498 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f673f811-c1d1-4e11-94d8-4932e9761bbf-openstack-config-secret\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.430127 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktnfv\" (UniqueName: \"kubernetes.io/projected/f673f811-c1d1-4e11-94d8-4932e9761bbf-kube-api-access-ktnfv\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.430175 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f673f811-c1d1-4e11-94d8-4932e9761bbf-openstack-config\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.430195 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f673f811-c1d1-4e11-94d8-4932e9761bbf-openstack-config-secret\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.430302 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f673f811-c1d1-4e11-94d8-4932e9761bbf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.431036 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f673f811-c1d1-4e11-94d8-4932e9761bbf-openstack-config\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.433892 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f673f811-c1d1-4e11-94d8-4932e9761bbf-openstack-config-secret\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.434088 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f673f811-c1d1-4e11-94d8-4932e9761bbf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.449554 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktnfv\" (UniqueName: \"kubernetes.io/projected/f673f811-c1d1-4e11-94d8-4932e9761bbf-kube-api-access-ktnfv\") pod \"openstackclient\" (UID: \"f673f811-c1d1-4e11-94d8-4932e9761bbf\") " pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.500152 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.669862 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerStarted","Data":"9d15d81a7d9d565d0c133fb641536c92dbd0b74fa375f2f19df145e67a197434"} Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.686188 4933 generic.go:334] "Generic (PLEG): container finished" podID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerID="e36013856ecff5111ea7596fef6b0a38bd1cb5b093d5814e8b206b60a88f9b4a" exitCode=0 Dec 02 16:15:52 crc kubenswrapper[4933]: I1202 16:15:52.686273 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8d6fe320-04c0-4184-a67d-50676c0717b5","Type":"ContainerDied","Data":"e36013856ecff5111ea7596fef6b0a38bd1cb5b093d5814e8b206b60a88f9b4a"} Dec 02 16:15:52 crc kubenswrapper[4933]: E1202 16:15:52.982336 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:53 crc kubenswrapper[4933]: I1202 16:15:53.013477 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 16:15:53 crc kubenswrapper[4933]: W1202 16:15:53.023064 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf673f811_c1d1_4e11_94d8_4932e9761bbf.slice/crio-435d025caacf366be3ad6f635ef480218fa81fb9107fb1bf01250ebc92532781 WatchSource:0}: Error finding container 435d025caacf366be3ad6f635ef480218fa81fb9107fb1bf01250ebc92532781: Status 404 returned error can't find the container with id 435d025caacf366be3ad6f635ef480218fa81fb9107fb1bf01250ebc92532781 Dec 02 16:15:53 crc kubenswrapper[4933]: I1202 16:15:53.069551 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc5d99c-4636-42d0-a7d3-6af1d371695c" path="/var/lib/kubelet/pods/4cc5d99c-4636-42d0-a7d3-6af1d371695c/volumes" Dec 02 16:15:53 crc kubenswrapper[4933]: I1202 16:15:53.699125 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f673f811-c1d1-4e11-94d8-4932e9761bbf","Type":"ContainerStarted","Data":"435d025caacf366be3ad6f635ef480218fa81fb9107fb1bf01250ebc92532781"} Dec 02 16:15:53 crc kubenswrapper[4933]: I1202 16:15:53.701712 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerStarted","Data":"ed70c571191960cb8e95c29b093ad9380836c1a17a85f2da93b285fba4985071"} Dec 02 16:15:53 crc kubenswrapper[4933]: I1202 16:15:53.701884 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:15:53 crc kubenswrapper[4933]: I1202 16:15:53.740464 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5787641629999998 podStartE2EDuration="7.740447723s" podCreationTimestamp="2025-12-02 16:15:46 +0000 UTC" firstStartedPulling="2025-12-02 16:15:47.652182258 +0000 UTC m=+1410.903408961" lastFinishedPulling="2025-12-02 16:15:52.813865818 +0000 UTC m=+1416.065092521" observedRunningTime="2025-12-02 16:15:53.735598101 +0000 UTC 
m=+1416.986824804" watchObservedRunningTime="2025-12-02 16:15:53.740447723 +0000 UTC m=+1416.991674426" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.507213 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-76479c94dd-9jkgg"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.509005 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.513912 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.513947 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-kbbdj" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.514120 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.547122 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76479c94dd-9jkgg"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.652008 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-9jz6z"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.653939 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.679313 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-9jz6z"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.691436 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data-custom\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.691564 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw47r\" (UniqueName: \"kubernetes.io/projected/8aebfa1d-65c0-4f16-bcee-86e03d923f99-kube-api-access-rw47r\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.691638 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-combined-ca-bundle\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.691658 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.747049 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-75b459779d-29rn7"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.754150 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.758732 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.779868 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-75b459779d-29rn7"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797325 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data-custom\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797402 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797504 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-config\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797615 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw47r\" (UniqueName: \"kubernetes.io/projected/8aebfa1d-65c0-4f16-bcee-86e03d923f99-kube-api-access-rw47r\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797642 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797692 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-svc\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797800 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-combined-ca-bundle\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797856 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " 
pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797899 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.797937 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sffnq\" (UniqueName: \"kubernetes.io/projected/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-kube-api-access-sffnq\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.806121 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data-custom\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.811292 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.814651 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-combined-ca-bundle\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.825344 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw47r\" (UniqueName: \"kubernetes.io/projected/8aebfa1d-65c0-4f16-bcee-86e03d923f99-kube-api-access-rw47r\") pod \"heat-engine-76479c94dd-9jkgg\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.826886 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-84dddfc884-5ff6c"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.828642 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.832702 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.845887 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-84dddfc884-5ff6c"] Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.861774 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901392 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-combined-ca-bundle\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901449 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901486 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sffnq\" (UniqueName: \"kubernetes.io/projected/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-kube-api-access-sffnq\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901570 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7s5\" (UniqueName: \"kubernetes.io/projected/2da32456-86db-40a7-afb5-9d8b6e5866d6-kube-api-access-zw7s5\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901602 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901675 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-config\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901696 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901770 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901796 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data-custom\") pod 
\"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.901839 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-svc\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.902989 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-svc\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.903187 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.903392 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.903790 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.904015 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-config\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:54 crc kubenswrapper[4933]: I1202 16:15:54.930584 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sffnq\" (UniqueName: \"kubernetes.io/projected/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-kube-api-access-sffnq\") pod \"dnsmasq-dns-7d978555f9-9jz6z\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.001039 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.004362 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7s5\" (UniqueName: \"kubernetes.io/projected/2da32456-86db-40a7-afb5-9d8b6e5866d6-kube-api-access-zw7s5\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.004436 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.004514 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.004767 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data-custom\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.004980 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-combined-ca-bundle\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.005007 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8twx\" (UniqueName: \"kubernetes.io/projected/036484a3-ab5b-4bf2-a02c-9fb583904529-kube-api-access-p8twx\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.005077 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-combined-ca-bundle\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.005114 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data-custom\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.010107 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-combined-ca-bundle\") pod \"heat-cfnapi-75b459779d-29rn7\" 
(UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.015547 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.034572 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data-custom\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.038707 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7s5\" (UniqueName: \"kubernetes.io/projected/2da32456-86db-40a7-afb5-9d8b6e5866d6-kube-api-access-zw7s5\") pod \"heat-cfnapi-75b459779d-29rn7\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.093239 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.108409 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.108664 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-combined-ca-bundle\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.108694 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8twx\" (UniqueName: \"kubernetes.io/projected/036484a3-ab5b-4bf2-a02c-9fb583904529-kube-api-access-p8twx\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.108751 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data-custom\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.114671 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data-custom\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.120313 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-combined-ca-bundle\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.126400 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.131541 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8twx\" (UniqueName: \"kubernetes.io/projected/036484a3-ab5b-4bf2-a02c-9fb583904529-kube-api-access-p8twx\") pod \"heat-api-84dddfc884-5ff6c\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.394338 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.529585 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76479c94dd-9jkgg"] Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.787480 4933 generic.go:334] "Generic (PLEG): container finished" podID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerID="ef0193529a29cc40187469d2c484931607336e42925a30872414d9f579bc3945" exitCode=0 Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.787737 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8d6fe320-04c0-4184-a67d-50676c0717b5","Type":"ContainerDied","Data":"ef0193529a29cc40187469d2c484931607336e42925a30872414d9f579bc3945"} Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.790415 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76479c94dd-9jkgg" event={"ID":"8aebfa1d-65c0-4f16-bcee-86e03d923f99","Type":"ContainerStarted","Data":"407e37b3be8333c149e6ce9c198dccd919da281b9496beb91a98b634e64c96d4"} Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.834114 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-9jz6z"] Dec 02 16:15:55 crc kubenswrapper[4933]: I1202 16:15:55.957871 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-75b459779d-29rn7"] Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.228379 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-84dddfc884-5ff6c"] Dec 02 16:15:56 crc kubenswrapper[4933]: W1202 16:15:56.278720 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod036484a3_ab5b_4bf2_a02c_9fb583904529.slice/crio-47f10ee3d6e47daa64aa5cd6183750639c41f6dc31a437a36bcf97e4402a1a8a WatchSource:0}: Error finding container 47f10ee3d6e47daa64aa5cd6183750639c41f6dc31a437a36bcf97e4402a1a8a: Status 404 returned error can't find the container with id 47f10ee3d6e47daa64aa5cd6183750639c41f6dc31a437a36bcf97e4402a1a8a Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.407727 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.461710 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-combined-ca-bundle\") pod \"8d6fe320-04c0-4184-a67d-50676c0717b5\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.461797 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh25g\" (UniqueName: \"kubernetes.io/projected/8d6fe320-04c0-4184-a67d-50676c0717b5-kube-api-access-nh25g\") pod \"8d6fe320-04c0-4184-a67d-50676c0717b5\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.461973 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6fe320-04c0-4184-a67d-50676c0717b5-etc-machine-id\") pod \"8d6fe320-04c0-4184-a67d-50676c0717b5\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.462044 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data-custom\") pod \"8d6fe320-04c0-4184-a67d-50676c0717b5\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.462066 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data\") pod \"8d6fe320-04c0-4184-a67d-50676c0717b5\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.462115 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-scripts\") pod \"8d6fe320-04c0-4184-a67d-50676c0717b5\" (UID: \"8d6fe320-04c0-4184-a67d-50676c0717b5\") " Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.463925 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d6fe320-04c0-4184-a67d-50676c0717b5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8d6fe320-04c0-4184-a67d-50676c0717b5" (UID: "8d6fe320-04c0-4184-a67d-50676c0717b5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.470123 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6fe320-04c0-4184-a67d-50676c0717b5-kube-api-access-nh25g" (OuterVolumeSpecName: "kube-api-access-nh25g") pod "8d6fe320-04c0-4184-a67d-50676c0717b5" (UID: "8d6fe320-04c0-4184-a67d-50676c0717b5"). InnerVolumeSpecName "kube-api-access-nh25g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.473341 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8d6fe320-04c0-4184-a67d-50676c0717b5" (UID: "8d6fe320-04c0-4184-a67d-50676c0717b5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.476477 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-scripts" (OuterVolumeSpecName: "scripts") pod "8d6fe320-04c0-4184-a67d-50676c0717b5" (UID: "8d6fe320-04c0-4184-a67d-50676c0717b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.567611 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh25g\" (UniqueName: \"kubernetes.io/projected/8d6fe320-04c0-4184-a67d-50676c0717b5-kube-api-access-nh25g\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.567640 4933 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d6fe320-04c0-4184-a67d-50676c0717b5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.567651 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.567660 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.595207 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d6fe320-04c0-4184-a67d-50676c0717b5" (UID: "8d6fe320-04c0-4184-a67d-50676c0717b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.669552 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.702236 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data" (OuterVolumeSpecName: "config-data") pod "8d6fe320-04c0-4184-a67d-50676c0717b5" (UID: "8d6fe320-04c0-4184-a67d-50676c0717b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.771396 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d6fe320-04c0-4184-a67d-50676c0717b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.826200 4933 generic.go:334] "Generic (PLEG): container finished" podID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerID="63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654" exitCode=0 Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.827179 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" event={"ID":"01dfefe7-534e-41e1-9f9b-a59f177f4c7e","Type":"ContainerDied","Data":"63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654"} Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.827206 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" event={"ID":"01dfefe7-534e-41e1-9f9b-a59f177f4c7e","Type":"ContainerStarted","Data":"07f83c27f58afda8100dcb5ef6bd319f39d955614ebab2ee0f4e7e6678bd7287"} Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.842345 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.843369 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8d6fe320-04c0-4184-a67d-50676c0717b5","Type":"ContainerDied","Data":"93d9fea5ae34e09e6b85ba49db3f9d1ba3e987bdbf8ae6717ec5afcfe0693d28"} Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.843422 4933 scope.go:117] "RemoveContainer" containerID="e36013856ecff5111ea7596fef6b0a38bd1cb5b093d5814e8b206b60a88f9b4a" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.853179 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-84dddfc884-5ff6c" event={"ID":"036484a3-ab5b-4bf2-a02c-9fb583904529","Type":"ContainerStarted","Data":"47f10ee3d6e47daa64aa5cd6183750639c41f6dc31a437a36bcf97e4402a1a8a"} Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.855508 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76479c94dd-9jkgg" event={"ID":"8aebfa1d-65c0-4f16-bcee-86e03d923f99","Type":"ContainerStarted","Data":"cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c"} Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.855743 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.859705 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75b459779d-29rn7" event={"ID":"2da32456-86db-40a7-afb5-9d8b6e5866d6","Type":"ContainerStarted","Data":"c2c7abdafa6e14b4ed869e5d13a7d730056bfc6dc68be6be98906acc7268680d"} Dec 02 16:15:56 crc kubenswrapper[4933]: I1202 16:15:56.885084 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-76479c94dd-9jkgg" podStartSLOduration=2.885042496 podStartE2EDuration="2.885042496s" podCreationTimestamp="2025-12-02 16:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:56.877946053 +0000 UTC m=+1420.129172756" watchObservedRunningTime="2025-12-02 16:15:56.885042496 +0000 UTC m=+1420.136269199" Dec 02 
Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.063878 4933 scope.go:117] "RemoveContainer" containerID="ef0193529a29cc40187469d2c484931607336e42925a30872414d9f579bc3945" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.244081 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.268641 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.282156 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:57 crc kubenswrapper[4933]: E1202 16:15:57.282709 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="probe" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.282730 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="probe" Dec 02 16:15:57 crc kubenswrapper[4933]: E1202 16:15:57.282750 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="cinder-scheduler" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.282758 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="cinder-scheduler" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.283004 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="probe" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.283031 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" containerName="cinder-scheduler" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.284295 4933 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.291941 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.295205 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.412636 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-scripts\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.412779 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.412896 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-config-data\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.412917 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60f07df2-68b3-4c78-9279-5dd6d9c71397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.413070 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.413249 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqbk\" (UniqueName: \"kubernetes.io/projected/60f07df2-68b3-4c78-9279-5dd6d9c71397-kube-api-access-fsqbk\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: E1202 16:15:57.513425 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d6fe320_04c0_4184_a67d_50676c0717b5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d6fe320_04c0_4184_a67d_50676c0717b5.slice/crio-93d9fea5ae34e09e6b85ba49db3f9d1ba3e987bdbf8ae6717ec5afcfe0693d28\": RecentStats: unable to find data in memory cache]" Dec 02 16:15:57 crc 
Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.519564 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-scripts\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.519632 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.519673 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-config-data\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.519689 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60f07df2-68b3-4c78-9279-5dd6d9c71397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.519778 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.519888 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqbk\" (UniqueName: \"kubernetes.io/projected/60f07df2-68b3-4c78-9279-5dd6d9c71397-kube-api-access-fsqbk\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.520890 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60f07df2-68b3-4c78-9279-5dd6d9c71397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.528226 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.530329 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-scripts\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.531125 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-config-data\") pod \"cinder-scheduler-0\" (UID:
\"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.534807 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60f07df2-68b3-4c78-9279-5dd6d9c71397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.543276 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqbk\" (UniqueName: \"kubernetes.io/projected/60f07df2-68b3-4c78-9279-5dd6d9c71397-kube-api-access-fsqbk\") pod \"cinder-scheduler-0\" (UID: \"60f07df2-68b3-4c78-9279-5dd6d9c71397\") " pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.625724 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.931418 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" event={"ID":"01dfefe7-534e-41e1-9f9b-a59f177f4c7e","Type":"ContainerStarted","Data":"b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b"} Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.931743 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:15:57 crc kubenswrapper[4933]: I1202 16:15:57.961139 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" podStartSLOduration=3.961094185 podStartE2EDuration="3.961094185s" podCreationTimestamp="2025-12-02 16:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:15:57.955595685 +0000 UTC m=+1421.206822378" watchObservedRunningTime="2025-12-02 16:15:57.961094185 +0000 UTC m=+1421.212320888" Dec 02 16:15:58 crc kubenswrapper[4933]: I1202 16:15:58.263886 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 16:15:58 crc kubenswrapper[4933]: I1202 16:15:58.807717 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.200:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:15:59 crc kubenswrapper[4933]: I1202 16:15:59.001957 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60f07df2-68b3-4c78-9279-5dd6d9c71397","Type":"ContainerStarted","Data":"413b8c934210b6af3a93732046f27b448656eaa995ab5f84166c1f42dd9d5bce"} Dec 02 16:15:59 crc kubenswrapper[4933]: I1202 16:15:59.113503 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6fe320-04c0-4184-a67d-50676c0717b5" path="/var/lib/kubelet/pods/8d6fe320-04c0-4184-a67d-50676c0717b5/volumes" Dec 02 16:15:59 crc kubenswrapper[4933]: I1202 16:15:59.773237 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.020054 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"60f07df2-68b3-4c78-9279-5dd6d9c71397","Type":"ContainerStarted","Data":"759e11e86e24ae1edd7935f0654f2e097dfe980801a66993aa4fb05b587f58d7"} Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.560714 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-64f6f8f6c-x5wnf"] Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.563658 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.567976 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.568250 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.570454 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.599557 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-64f6f8f6c-x5wnf"] Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.750942 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/97b9d44e-b2f2-4da0-82e0-28c657d8df41-etc-swift\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751032 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97b9d44e-b2f2-4da0-82e0-28c657d8df41-log-httpd\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751070 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-config-data\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751162 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97b9d44e-b2f2-4da0-82e0-28c657d8df41-run-httpd\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751278 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4rb\" (UniqueName: \"kubernetes.io/projected/97b9d44e-b2f2-4da0-82e0-28c657d8df41-kube-api-access-8z4rb\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751319 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-public-tls-certs\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " 
pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751375 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-internal-tls-certs\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.751417 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-combined-ca-bundle\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853679 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4rb\" (UniqueName: \"kubernetes.io/projected/97b9d44e-b2f2-4da0-82e0-28c657d8df41-kube-api-access-8z4rb\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853721 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-public-tls-certs\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853755 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-internal-tls-certs\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853782 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-combined-ca-bundle\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853892 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/97b9d44e-b2f2-4da0-82e0-28c657d8df41-etc-swift\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853930 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97b9d44e-b2f2-4da0-82e0-28c657d8df41-log-httpd\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.853954 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-config-data\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " 
pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.854407 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97b9d44e-b2f2-4da0-82e0-28c657d8df41-log-httpd\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.854721 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97b9d44e-b2f2-4da0-82e0-28c657d8df41-run-httpd\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.856212 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97b9d44e-b2f2-4da0-82e0-28c657d8df41-run-httpd\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.863428 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/97b9d44e-b2f2-4da0-82e0-28c657d8df41-etc-swift\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.864013 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-public-tls-certs\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.865771 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-config-data\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.872574 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-internal-tls-certs\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.875806 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b9d44e-b2f2-4da0-82e0-28c657d8df41-combined-ca-bundle\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:00 crc kubenswrapper[4933]: I1202 16:16:00.896472 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4rb\" (UniqueName: \"kubernetes.io/projected/97b9d44e-b2f2-4da0-82e0-28c657d8df41-kube-api-access-8z4rb\") pod \"swift-proxy-64f6f8f6c-x5wnf\" (UID: \"97b9d44e-b2f2-4da0-82e0-28c657d8df41\") " pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:01 crc kubenswrapper[4933]: I1202 16:16:01.187693 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:01 crc kubenswrapper[4933]: I1202 16:16:01.849195 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-64b6fd7f9d-b5l2s"] Dec 02 16:16:01 crc kubenswrapper[4933]: I1202 16:16:01.857865 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:01 crc kubenswrapper[4933]: I1202 16:16:01.878093 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-57cc8c89b8-9grd9"] Dec 02 16:16:01 crc kubenswrapper[4933]: I1202 16:16:01.881275 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.018191 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-64b6fd7f9d-b5l2s"] Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.054908 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57cc8c89b8-9grd9"] Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.062689 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqrzs\" (UniqueName: \"kubernetes.io/projected/b5ea07dd-c4dc-421e-b710-8d759f0d9487-kube-api-access-jqrzs\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.062760 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-combined-ca-bundle\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.062778 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data-custom\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.062796 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.062884 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data-custom\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.062933 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc 
kubenswrapper[4933]: I1202 16:16:02.062970 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxk8\" (UniqueName: \"kubernetes.io/projected/59434c0f-e21f-45a2-909b-fa1143a5b5ae-kube-api-access-lfxk8\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.063009 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-combined-ca-bundle\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.098883 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5545fbccfb-mqjbx"] Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.100593 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.155895 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5545fbccfb-mqjbx"] Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174166 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxk8\" (UniqueName: \"kubernetes.io/projected/59434c0f-e21f-45a2-909b-fa1143a5b5ae-kube-api-access-lfxk8\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174247 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-combined-ca-bundle\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174340 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrzs\" (UniqueName: \"kubernetes.io/projected/b5ea07dd-c4dc-421e-b710-8d759f0d9487-kube-api-access-jqrzs\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174379 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-combined-ca-bundle\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174398 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data-custom\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174414 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data\") pod 
\"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174448 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data-custom\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.174494 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.191852 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data-custom\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.193984 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-combined-ca-bundle\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.210957 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.225861 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.226700 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data-custom\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.227267 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-combined-ca-bundle\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.245159 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrzs\" (UniqueName: \"kubernetes.io/projected/b5ea07dd-c4dc-421e-b710-8d759f0d9487-kube-api-access-jqrzs\") pod \"heat-api-57cc8c89b8-9grd9\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " pod="openstack/heat-api-57cc8c89b8-9grd9" 
Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.245615 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxk8\" (UniqueName: \"kubernetes.io/projected/59434c0f-e21f-45a2-909b-fa1143a5b5ae-kube-api-access-lfxk8\") pod \"heat-engine-64b6fd7f9d-b5l2s\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.280072 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data-custom\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.280148 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-combined-ca-bundle\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.280201 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.280316 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfdm\" (UniqueName: \"kubernetes.io/projected/4554a65a-0086-4a8d-8d70-57e016ad418e-kube-api-access-jbfdm\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.340618 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.381904 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfdm\" (UniqueName: \"kubernetes.io/projected/4554a65a-0086-4a8d-8d70-57e016ad418e-kube-api-access-jbfdm\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.381976 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data-custom\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.382023 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-combined-ca-bundle\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.382085 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.407732 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-combined-ca-bundle\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.409625 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data-custom\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.410335 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.418738 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfdm\" (UniqueName: \"kubernetes.io/projected/4554a65a-0086-4a8d-8d70-57e016ad418e-kube-api-access-jbfdm\") pod \"heat-cfnapi-5545fbccfb-mqjbx\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.445527 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.661051 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:02 crc kubenswrapper[4933]: I1202 16:16:02.676653 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-64f6f8f6c-x5wnf"] Dec 02 16:16:03 crc kubenswrapper[4933]: E1202 16:16:03.121481 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.207785 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" event={"ID":"97b9d44e-b2f2-4da0-82e0-28c657d8df41","Type":"ContainerStarted","Data":"daa73e293194017bffbe9e3d2663fdca63e41139c165f7cfcc31645d3630c700"} Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.209436 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57cc8c89b8-9grd9"] Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.211302 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-84dddfc884-5ff6c" event={"ID":"036484a3-ab5b-4bf2-a02c-9fb583904529","Type":"ContainerStarted","Data":"2b9268acc0ad9dc119c98d5f1d24eac32655dee2c52b944ec29cfacba1dbb5ef"} Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.214686 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.307517 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75b459779d-29rn7" event={"ID":"2da32456-86db-40a7-afb5-9d8b6e5866d6","Type":"ContainerStarted","Data":"71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552"} Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.308486 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.435867 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-84dddfc884-5ff6c" podStartSLOduration=4.001890025 podStartE2EDuration="9.435840204s" podCreationTimestamp="2025-12-02 16:15:54 +0000 UTC" firstStartedPulling="2025-12-02 16:15:56.291456583 +0000 UTC m=+1419.542683286" lastFinishedPulling="2025-12-02 16:16:01.725406762 +0000 UTC m=+1424.976633465" observedRunningTime="2025-12-02 16:16:03.261616389 +0000 UTC m=+1426.512843092" watchObservedRunningTime="2025-12-02 16:16:03.435840204 +0000 UTC m=+1426.687066937" Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.484566 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-64b6fd7f9d-b5l2s"] Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.489724 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-75b459779d-29rn7" podStartSLOduration=3.722763387 podStartE2EDuration="9.489703358s" podCreationTimestamp="2025-12-02 16:15:54 +0000 UTC" firstStartedPulling="2025-12-02 16:15:55.953780204 +0000 UTC m=+1419.205006907" lastFinishedPulling="2025-12-02 16:16:01.720720175 +0000 UTC m=+1424.971946878" observedRunningTime="2025-12-02 16:16:03.373590582 +0000 UTC m=+1426.624817275" watchObservedRunningTime="2025-12-02 16:16:03.489703358 +0000 UTC m=+1426.740930061" Dec 02 16:16:03 crc kubenswrapper[4933]: 
I1202 16:16:03.534255 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5545fbccfb-mqjbx"] Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.867147 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.867512 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-central-agent" containerID="cri-o://031a7ad11bf9c6a8a8c6dc1425d10b1dc132c180d71a86bb43b8c3bb2cdc9116" gracePeriod=30 Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.867978 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="proxy-httpd" containerID="cri-o://ed70c571191960cb8e95c29b093ad9380836c1a17a85f2da93b285fba4985071" gracePeriod=30 Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.868038 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="sg-core" containerID="cri-o://9d15d81a7d9d565d0c133fb641536c92dbd0b74fa375f2f19df145e67a197434" gracePeriod=30 Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.868075 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-notification-agent" containerID="cri-o://c6240e38b31d693201a8016a4efb4d03d3c6b66e5de3e1fc3934db6e3b139897" gracePeriod=30 Dec 02 16:16:03 crc kubenswrapper[4933]: I1202 16:16:03.986741 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.201:3000/\": read tcp 10.217.0.2:53456->10.217.0.201:3000: read: connection reset by peer" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.326925 4933 generic.go:334] "Generic (PLEG): container finished" podID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerID="ed70c571191960cb8e95c29b093ad9380836c1a17a85f2da93b285fba4985071" exitCode=0 Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.326958 4933 generic.go:334] "Generic (PLEG): container finished" podID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerID="9d15d81a7d9d565d0c133fb641536c92dbd0b74fa375f2f19df145e67a197434" exitCode=2 Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.327017 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerDied","Data":"ed70c571191960cb8e95c29b093ad9380836c1a17a85f2da93b285fba4985071"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.327057 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerDied","Data":"9d15d81a7d9d565d0c133fb641536c92dbd0b74fa375f2f19df145e67a197434"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.329282 4933 generic.go:334] "Generic (PLEG): container finished" podID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerID="abe10cb3c29ecb543a46d49742195c331b2d8579670e41803029f69a2c88d71a" exitCode=1 Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.329332 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-57cc8c89b8-9grd9" event={"ID":"b5ea07dd-c4dc-421e-b710-8d759f0d9487","Type":"ContainerDied","Data":"abe10cb3c29ecb543a46d49742195c331b2d8579670e41803029f69a2c88d71a"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.329353 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57cc8c89b8-9grd9" event={"ID":"b5ea07dd-c4dc-421e-b710-8d759f0d9487","Type":"ContainerStarted","Data":"54da929ff1e237b7222e00fbce8def44efbb1aa6460df4a49e3db48c0d1fa804"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.330510 4933 scope.go:117] "RemoveContainer" containerID="abe10cb3c29ecb543a46d49742195c331b2d8579670e41803029f69a2c88d71a" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.335068 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60f07df2-68b3-4c78-9279-5dd6d9c71397","Type":"ContainerStarted","Data":"29bfe94d35b81a42979c20ff6817f11f7bca46186173ecf3466fa18da16d9ac6"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.353729 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" event={"ID":"4554a65a-0086-4a8d-8d70-57e016ad418e","Type":"ContainerStarted","Data":"c274ad5cdd5a15af8007b0cd93ac3eedb1e23fde84c46cd69d6ad15cd5e2416b"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.353950 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" event={"ID":"4554a65a-0086-4a8d-8d70-57e016ad418e","Type":"ContainerStarted","Data":"da059523de4bf7b97bd488a4aed6051e04027f4035cf36d64a28830722c09296"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.353976 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.356075 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" event={"ID":"59434c0f-e21f-45a2-909b-fa1143a5b5ae","Type":"ContainerStarted","Data":"233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.356115 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" event={"ID":"59434c0f-e21f-45a2-909b-fa1143a5b5ae","Type":"ContainerStarted","Data":"f6a899d43c400f1edd0baa4bbb9b5afe2f83946f9b7961f7ecf625e2b3793b42"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.357160 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.363048 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" event={"ID":"97b9d44e-b2f2-4da0-82e0-28c657d8df41","Type":"ContainerStarted","Data":"a0fb390ad08d6c2038889eb818376f670a59bedbab2bc0fa23777928bdcf229a"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.363098 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" event={"ID":"97b9d44e-b2f2-4da0-82e0-28c657d8df41","Type":"ContainerStarted","Data":"860ffd2c26e463385c2f0fc0df4785e8ae3786eabadd0bb7fee16de1ce81a118"} Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.363458 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.363479 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.421652 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.421630798 podStartE2EDuration="7.421630798s" podCreationTimestamp="2025-12-02 16:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:04.369126811 +0000 UTC m=+1427.620353514" watchObservedRunningTime="2025-12-02 16:16:04.421630798 +0000 UTC m=+1427.672857501" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.435845 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" podStartSLOduration=3.435807904 podStartE2EDuration="3.435807904s" podCreationTimestamp="2025-12-02 16:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:04.395890379 +0000 UTC m=+1427.647117082" watchObservedRunningTime="2025-12-02 16:16:04.435807904 +0000 UTC m=+1427.687034607" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.477964 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" podStartSLOduration=3.477809535 podStartE2EDuration="3.477809535s" podCreationTimestamp="2025-12-02 16:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:04.415392799 +0000 UTC m=+1427.666619512" watchObservedRunningTime="2025-12-02 16:16:04.477809535 +0000 UTC m=+1427.729036238" Dec 02 16:16:04 crc kubenswrapper[4933]: I1202 16:16:04.520387 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" podStartSLOduration=4.520364892 podStartE2EDuration="4.520364892s" podCreationTimestamp="2025-12-02 16:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:04.444733286 +0000 UTC m=+1427.695959989" watchObservedRunningTime="2025-12-02 16:16:04.520364892 +0000 UTC m=+1427.771591605" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.007547 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.127081 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zj8w6"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.127343 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="dnsmasq-dns" containerID="cri-o://99001867c6db24be7d002dbc692073a058a7ce942910f622408b5ba18f8a730e" gracePeriod=10 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.391971 4933 generic.go:334] "Generic (PLEG): container finished" podID="317f6015-0b9d-4a80-b022-2b77224b0284" containerID="99001867c6db24be7d002dbc692073a058a7ce942910f622408b5ba18f8a730e" exitCode=0 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.392162 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" 
event={"ID":"317f6015-0b9d-4a80-b022-2b77224b0284","Type":"ContainerDied","Data":"99001867c6db24be7d002dbc692073a058a7ce942910f622408b5ba18f8a730e"} Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.417136 4933 generic.go:334] "Generic (PLEG): container finished" podID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerID="031a7ad11bf9c6a8a8c6dc1425d10b1dc132c180d71a86bb43b8c3bb2cdc9116" exitCode=0 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.417236 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerDied","Data":"031a7ad11bf9c6a8a8c6dc1425d10b1dc132c180d71a86bb43b8c3bb2cdc9116"} Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.429872 4933 generic.go:334] "Generic (PLEG): container finished" podID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerID="826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1" exitCode=1 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.429961 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57cc8c89b8-9grd9" event={"ID":"b5ea07dd-c4dc-421e-b710-8d759f0d9487","Type":"ContainerDied","Data":"826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1"} Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.429995 4933 scope.go:117] "RemoveContainer" containerID="abe10cb3c29ecb543a46d49742195c331b2d8579670e41803029f69a2c88d71a" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.430663 4933 scope.go:117] "RemoveContainer" containerID="826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1" Dec 02 16:16:05 crc kubenswrapper[4933]: E1202 16:16:05.431069 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-57cc8c89b8-9grd9_openstack(b5ea07dd-c4dc-421e-b710-8d759f0d9487)\"" pod="openstack/heat-api-57cc8c89b8-9grd9" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.436336 4933 generic.go:334] "Generic (PLEG): container finished" podID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerID="c274ad5cdd5a15af8007b0cd93ac3eedb1e23fde84c46cd69d6ad15cd5e2416b" exitCode=1 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.437988 4933 scope.go:117] "RemoveContainer" containerID="c274ad5cdd5a15af8007b0cd93ac3eedb1e23fde84c46cd69d6ad15cd5e2416b" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.439035 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" event={"ID":"4554a65a-0086-4a8d-8d70-57e016ad418e","Type":"ContainerDied","Data":"c274ad5cdd5a15af8007b0cd93ac3eedb1e23fde84c46cd69d6ad15cd5e2416b"} Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.627283 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-84dddfc884-5ff6c"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.627487 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-84dddfc884-5ff6c" podUID="036484a3-ab5b-4bf2-a02c-9fb583904529" containerName="heat-api" containerID="cri-o://2b9268acc0ad9dc119c98d5f1d24eac32655dee2c52b944ec29cfacba1dbb5ef" gracePeriod=60 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.748393 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-75b459779d-29rn7"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.748809 4933 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-75b459779d-29rn7" podUID="2da32456-86db-40a7-afb5-9d8b6e5866d6" containerName="heat-cfnapi" containerID="cri-o://71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552" gracePeriod=60 Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.819334 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5558cd5dc7-s4k68"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.820871 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.835839 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5558cd5dc7-s4k68"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.842731 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.843458 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.850276 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data-custom\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.850390 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwgk\" (UniqueName: \"kubernetes.io/projected/4caba072-5d91-4126-909a-b4e7ad6167d6-kube-api-access-nxwgk\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.850596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.850696 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-public-tls-certs\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.850723 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-internal-tls-certs\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.850780 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-combined-ca-bundle\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " 
pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.878881 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5979c644c5-dckr9"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.880483 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.892707 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.892882 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.940783 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5979c644c5-dckr9"] Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.955944 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrxb\" (UniqueName: \"kubernetes.io/projected/3aa14443-9372-4f3d-bc34-3463b051b107-kube-api-access-mwrxb\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.955990 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-public-tls-certs\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956020 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-internal-tls-certs\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956043 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-combined-ca-bundle\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956097 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-combined-ca-bundle\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956162 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data-custom\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956184 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwgk\" (UniqueName: \"kubernetes.io/projected/4caba072-5d91-4126-909a-b4e7ad6167d6-kube-api-access-nxwgk\") pod 
\"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956217 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956237 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956260 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-public-tls-certs\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956310 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-internal-tls-certs\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.956333 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data-custom\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.963650 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-public-tls-certs\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.964180 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.967577 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-combined-ca-bundle\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.993596 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data-custom\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: 
\"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:05 crc kubenswrapper[4933]: I1202 16:16:05.995474 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-internal-tls-certs\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.010698 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwgk\" (UniqueName: \"kubernetes.io/projected/4caba072-5d91-4126-909a-b4e7ad6167d6-kube-api-access-nxwgk\") pod \"heat-api-5558cd5dc7-s4k68\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.058455 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.058509 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-public-tls-certs\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.058569 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-internal-tls-certs\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.058593 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data-custom\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.058637 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrxb\" (UniqueName: \"kubernetes.io/projected/3aa14443-9372-4f3d-bc34-3463b051b107-kube-api-access-mwrxb\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.058676 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-combined-ca-bundle\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.070835 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-internal-tls-certs\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " 
pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.086400 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.086649 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data-custom\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.092558 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-public-tls-certs\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.096679 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-combined-ca-bundle\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.122819 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrxb\" (UniqueName: \"kubernetes.io/projected/3aa14443-9372-4f3d-bc34-3463b051b107-kube-api-access-mwrxb\") pod \"heat-cfnapi-5979c644c5-dckr9\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.170478 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.183417 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.230456 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.264974 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76ll4\" (UniqueName: \"kubernetes.io/projected/317f6015-0b9d-4a80-b022-2b77224b0284-kube-api-access-76ll4\") pod \"317f6015-0b9d-4a80-b022-2b77224b0284\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.265093 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-svc\") pod \"317f6015-0b9d-4a80-b022-2b77224b0284\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.265191 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-config\") pod \"317f6015-0b9d-4a80-b022-2b77224b0284\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.265254 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-sb\") pod \"317f6015-0b9d-4a80-b022-2b77224b0284\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.265339 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-nb\") pod \"317f6015-0b9d-4a80-b022-2b77224b0284\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.265434 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-swift-storage-0\") pod \"317f6015-0b9d-4a80-b022-2b77224b0284\" (UID: \"317f6015-0b9d-4a80-b022-2b77224b0284\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.288687 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317f6015-0b9d-4a80-b022-2b77224b0284-kube-api-access-76ll4" (OuterVolumeSpecName: "kube-api-access-76ll4") pod "317f6015-0b9d-4a80-b022-2b77224b0284" (UID: "317f6015-0b9d-4a80-b022-2b77224b0284"). InnerVolumeSpecName "kube-api-access-76ll4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.368999 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76ll4\" (UniqueName: \"kubernetes.io/projected/317f6015-0b9d-4a80-b022-2b77224b0284-kube-api-access-76ll4\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.515391 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" event={"ID":"317f6015-0b9d-4a80-b022-2b77224b0284","Type":"ContainerDied","Data":"29227f17a8df73d6a7cce763652a4d6bbf51496172bbb6023be119e084ad73c3"} Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.515692 4933 scope.go:117] "RemoveContainer" containerID="99001867c6db24be7d002dbc692073a058a7ce942910f622408b5ba18f8a730e" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.515814 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.565140 4933 generic.go:334] "Generic (PLEG): container finished" podID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerID="c6240e38b31d693201a8016a4efb4d03d3c6b66e5de3e1fc3934db6e3b139897" exitCode=0 Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.565201 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerDied","Data":"c6240e38b31d693201a8016a4efb4d03d3c6b66e5de3e1fc3934db6e3b139897"} Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.595298 4933 scope.go:117] "RemoveContainer" containerID="826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1" Dec 02 16:16:06 crc kubenswrapper[4933]: E1202 16:16:06.595859 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-57cc8c89b8-9grd9_openstack(b5ea07dd-c4dc-421e-b710-8d759f0d9487)\"" pod="openstack/heat-api-57cc8c89b8-9grd9" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.601255 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-config" (OuterVolumeSpecName: "config") pod "317f6015-0b9d-4a80-b022-2b77224b0284" (UID: "317f6015-0b9d-4a80-b022-2b77224b0284"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.612612 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "317f6015-0b9d-4a80-b022-2b77224b0284" (UID: "317f6015-0b9d-4a80-b022-2b77224b0284"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.654297 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "317f6015-0b9d-4a80-b022-2b77224b0284" (UID: "317f6015-0b9d-4a80-b022-2b77224b0284"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.683046 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "317f6015-0b9d-4a80-b022-2b77224b0284" (UID: "317f6015-0b9d-4a80-b022-2b77224b0284"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.684545 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.684573 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.684588 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.723226 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "317f6015-0b9d-4a80-b022-2b77224b0284" (UID: "317f6015-0b9d-4a80-b022-2b77224b0284"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.786308 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.786334 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317f6015-0b9d-4a80-b022-2b77224b0284-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.865185 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zj8w6"] Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.872011 4933 scope.go:117] "RemoveContainer" containerID="e94949f39f78da1a801f22416f0a4d14c105ecdf40dd2a8fa01eb8d574b8d34a" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.875063 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zj8w6"] Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.895005 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.995856 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-sg-core-conf-yaml\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.995934 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tpds\" (UniqueName: \"kubernetes.io/projected/41bc6554-0594-4a0b-9b1c-063f09af19ef-kube-api-access-2tpds\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.996084 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-combined-ca-bundle\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.996111 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-scripts\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.996158 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-config-data\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.996251 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-run-httpd\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.996296 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-log-httpd\") pod \"41bc6554-0594-4a0b-9b1c-063f09af19ef\" (UID: \"41bc6554-0594-4a0b-9b1c-063f09af19ef\") " Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.997736 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:06 crc kubenswrapper[4933]: I1202 16:16:06.998661 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:06.999139 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5979c644c5-dckr9"] Dec 02 16:16:07 crc kubenswrapper[4933]: W1202 16:16:07.002074 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aa14443_9372_4f3d_bc34_3463b051b107.slice/crio-d6bb5d2dc6f3c14c0923dbcea314c35ef7d8b113e5912ae5fcddf415151f2c07 WatchSource:0}: Error finding container d6bb5d2dc6f3c14c0923dbcea314c35ef7d8b113e5912ae5fcddf415151f2c07: Status 404 returned error can't find the container with id d6bb5d2dc6f3c14c0923dbcea314c35ef7d8b113e5912ae5fcddf415151f2c07 Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.002595 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-scripts" (OuterVolumeSpecName: "scripts") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.005684 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41bc6554-0594-4a0b-9b1c-063f09af19ef-kube-api-access-2tpds" (OuterVolumeSpecName: "kube-api-access-2tpds") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "kube-api-access-2tpds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.088195 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" path="/var/lib/kubelet/pods/317f6015-0b9d-4a80-b022-2b77224b0284/volumes" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.118391 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.130928 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.133951 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.134959 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41bc6554-0594-4a0b-9b1c-063f09af19ef-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.135078 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.135177 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tpds\" (UniqueName: \"kubernetes.io/projected/41bc6554-0594-4a0b-9b1c-063f09af19ef-kube-api-access-2tpds\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.209538 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.218931 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5558cd5dc7-s4k68"] Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.230240 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.236807 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-config-data" (OuterVolumeSpecName: "config-data") pod "41bc6554-0594-4a0b-9b1c-063f09af19ef" (UID: "41bc6554-0594-4a0b-9b1c-063f09af19ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.254326 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-combined-ca-bundle\") pod \"2da32456-86db-40a7-afb5-9d8b6e5866d6\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.254444 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data-custom\") pod \"2da32456-86db-40a7-afb5-9d8b6e5866d6\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.254525 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw7s5\" (UniqueName: \"kubernetes.io/projected/2da32456-86db-40a7-afb5-9d8b6e5866d6-kube-api-access-zw7s5\") pod \"2da32456-86db-40a7-afb5-9d8b6e5866d6\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.254581 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data\") pod \"2da32456-86db-40a7-afb5-9d8b6e5866d6\" (UID: \"2da32456-86db-40a7-afb5-9d8b6e5866d6\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.255302 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.255319 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bc6554-0594-4a0b-9b1c-063f09af19ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.266555 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2da32456-86db-40a7-afb5-9d8b6e5866d6" (UID: "2da32456-86db-40a7-afb5-9d8b6e5866d6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.267285 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da32456-86db-40a7-afb5-9d8b6e5866d6-kube-api-access-zw7s5" (OuterVolumeSpecName: "kube-api-access-zw7s5") pod "2da32456-86db-40a7-afb5-9d8b6e5866d6" (UID: "2da32456-86db-40a7-afb5-9d8b6e5866d6"). InnerVolumeSpecName "kube-api-access-zw7s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.331003 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2da32456-86db-40a7-afb5-9d8b6e5866d6" (UID: "2da32456-86db-40a7-afb5-9d8b6e5866d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.355314 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.355366 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.357553 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.357577 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw7s5\" (UniqueName: \"kubernetes.io/projected/2da32456-86db-40a7-afb5-9d8b6e5866d6-kube-api-access-zw7s5\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.357655 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.374645 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data" (OuterVolumeSpecName: "config-data") pod "2da32456-86db-40a7-afb5-9d8b6e5866d6" (UID: "2da32456-86db-40a7-afb5-9d8b6e5866d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.459129 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da32456-86db-40a7-afb5-9d8b6e5866d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.604184 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5979c644c5-dckr9" event={"ID":"3aa14443-9372-4f3d-bc34-3463b051b107","Type":"ContainerStarted","Data":"d6bb5d2dc6f3c14c0923dbcea314c35ef7d8b113e5912ae5fcddf415151f2c07"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.613017 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41bc6554-0594-4a0b-9b1c-063f09af19ef","Type":"ContainerDied","Data":"56d6855ea9721351e5a8af06f6e077fb01313db7f3ea02a3098e5e44b9120591"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.613073 4933 scope.go:117] "RemoveContainer" containerID="ed70c571191960cb8e95c29b093ad9380836c1a17a85f2da93b285fba4985071" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.613243 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.624137 4933 generic.go:334] "Generic (PLEG): container finished" podID="036484a3-ab5b-4bf2-a02c-9fb583904529" containerID="2b9268acc0ad9dc119c98d5f1d24eac32655dee2c52b944ec29cfacba1dbb5ef" exitCode=0 Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.624223 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-84dddfc884-5ff6c" event={"ID":"036484a3-ab5b-4bf2-a02c-9fb583904529","Type":"ContainerDied","Data":"2b9268acc0ad9dc119c98d5f1d24eac32655dee2c52b944ec29cfacba1dbb5ef"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.626943 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.628888 4933 generic.go:334] "Generic (PLEG): container finished" podID="2da32456-86db-40a7-afb5-9d8b6e5866d6" containerID="71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552" exitCode=0 Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.628930 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75b459779d-29rn7" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.628974 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75b459779d-29rn7" event={"ID":"2da32456-86db-40a7-afb5-9d8b6e5866d6","Type":"ContainerDied","Data":"71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.628996 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75b459779d-29rn7" event={"ID":"2da32456-86db-40a7-afb5-9d8b6e5866d6","Type":"ContainerDied","Data":"c2c7abdafa6e14b4ed869e5d13a7d730056bfc6dc68be6be98906acc7268680d"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.632168 4933 generic.go:334] "Generic (PLEG): container finished" podID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerID="0fb953aaa2a273d0f89bdf4e52e7e7f823d467071d652705440301b7fed4cbf9" exitCode=1 Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.632172 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" event={"ID":"4554a65a-0086-4a8d-8d70-57e016ad418e","Type":"ContainerDied","Data":"0fb953aaa2a273d0f89bdf4e52e7e7f823d467071d652705440301b7fed4cbf9"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.632730 4933 scope.go:117] "RemoveContainer" containerID="0fb953aaa2a273d0f89bdf4e52e7e7f823d467071d652705440301b7fed4cbf9" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.633442 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5545fbccfb-mqjbx_openstack(4554a65a-0086-4a8d-8d70-57e016ad418e)\"" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.634671 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5558cd5dc7-s4k68" event={"ID":"4caba072-5d91-4126-909a-b4e7ad6167d6","Type":"ContainerStarted","Data":"e955d0062e741a6b46397d5c5d7268ec2ca4a0b5773f8e4a110b5fb1b04055f2"} Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.639466 4933 scope.go:117] "RemoveContainer" containerID="826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1" Dec 02 16:16:07 crc 
kubenswrapper[4933]: E1202 16:16:07.639676 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-57cc8c89b8-9grd9_openstack(b5ea07dd-c4dc-421e-b710-8d759f0d9487)\"" pod="openstack/heat-api-57cc8c89b8-9grd9" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.652953 4933 scope.go:117] "RemoveContainer" containerID="9d15d81a7d9d565d0c133fb641536c92dbd0b74fa375f2f19df145e67a197434" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.665508 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.666496 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.794202 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.798808 4933 scope.go:117] "RemoveContainer" containerID="c6240e38b31d693201a8016a4efb4d03d3c6b66e5de3e1fc3934db6e3b139897" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.832490 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.843429 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.854968 4933 scope.go:117] "RemoveContainer" containerID="031a7ad11bf9c6a8a8c6dc1425d10b1dc132c180d71a86bb43b8c3bb2cdc9116" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.857352 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858172 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036484a3-ab5b-4bf2-a02c-9fb583904529" containerName="heat-api" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858200 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="036484a3-ab5b-4bf2-a02c-9fb583904529" containerName="heat-api" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858220 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="sg-core" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858228 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="sg-core" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858242 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="init" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858250 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="init" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858276 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-central-agent" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858284 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-central-agent" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858309 4933 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-notification-agent" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858318 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-notification-agent" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858337 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da32456-86db-40a7-afb5-9d8b6e5866d6" containerName="heat-cfnapi" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858346 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da32456-86db-40a7-afb5-9d8b6e5866d6" containerName="heat-cfnapi" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858376 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="proxy-httpd" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858384 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="proxy-httpd" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.858407 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="dnsmasq-dns" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858414 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="dnsmasq-dns" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858677 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da32456-86db-40a7-afb5-9d8b6e5866d6" containerName="heat-cfnapi" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858696 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-notification-agent" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858704 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="sg-core" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858726 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="dnsmasq-dns" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858744 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="ceilometer-central-agent" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858765 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" containerName="proxy-httpd" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.858781 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="036484a3-ab5b-4bf2-a02c-9fb583904529" containerName="heat-api" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.871447 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.878501 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.878704 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.894296 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-75b459779d-29rn7"] Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.914920 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-75b459779d-29rn7"] Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.916435 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.924540 4933 scope.go:117] "RemoveContainer" containerID="71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.932052 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979106 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data\") pod \"036484a3-ab5b-4bf2-a02c-9fb583904529\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979157 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8twx\" (UniqueName: \"kubernetes.io/projected/036484a3-ab5b-4bf2-a02c-9fb583904529-kube-api-access-p8twx\") pod \"036484a3-ab5b-4bf2-a02c-9fb583904529\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979296 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-combined-ca-bundle\") pod \"036484a3-ab5b-4bf2-a02c-9fb583904529\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979405 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data-custom\") pod \"036484a3-ab5b-4bf2-a02c-9fb583904529\" (UID: \"036484a3-ab5b-4bf2-a02c-9fb583904529\") " Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979893 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-scripts\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979939 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.979980 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-run-httpd\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.980021 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-config-data\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.980088 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqw5z\" (UniqueName: \"kubernetes.io/projected/73119236-88e7-4698-a4a2-5ee8e0c1df03-kube-api-access-gqw5z\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.980118 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-log-httpd\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.980136 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.998022 4933 scope.go:117] "RemoveContainer" containerID="71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552" Dec 02 16:16:07 crc kubenswrapper[4933]: E1202 16:16:07.999609 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552\": container with ID starting with 71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552 not found: ID does not exist" containerID="71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.999642 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552"} err="failed to get container status \"71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552\": rpc error: code = NotFound desc = could not find container \"71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552\": container with ID starting with 71eefb8aaf3f8c9dd4b0128b1cc474ca55934f7a7d97a32177b25ae97337d552 not found: ID does not exist" Dec 02 16:16:07 crc kubenswrapper[4933]: I1202 16:16:07.999665 4933 scope.go:117] "RemoveContainer" containerID="c274ad5cdd5a15af8007b0cd93ac3eedb1e23fde84c46cd69d6ad15cd5e2416b" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.000082 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036484a3-ab5b-4bf2-a02c-9fb583904529-kube-api-access-p8twx" (OuterVolumeSpecName: "kube-api-access-p8twx") pod "036484a3-ab5b-4bf2-a02c-9fb583904529" (UID: "036484a3-ab5b-4bf2-a02c-9fb583904529"). InnerVolumeSpecName "kube-api-access-p8twx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.035763 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "036484a3-ab5b-4bf2-a02c-9fb583904529" (UID: "036484a3-ab5b-4bf2-a02c-9fb583904529"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.090690 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-scripts\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.090774 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.090892 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-run-httpd\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.090991 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-config-data\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.091137 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqw5z\" (UniqueName: \"kubernetes.io/projected/73119236-88e7-4698-a4a2-5ee8e0c1df03-kube-api-access-gqw5z\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.091197 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-log-httpd\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.091237 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.091423 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.091447 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8twx\" (UniqueName: \"kubernetes.io/projected/036484a3-ab5b-4bf2-a02c-9fb583904529-kube-api-access-p8twx\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:08 crc 
kubenswrapper[4933]: I1202 16:16:08.105377 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-run-httpd\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.106963 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-scripts\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.107479 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-log-httpd\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.108087 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-config-data\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.115508 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "036484a3-ab5b-4bf2-a02c-9fb583904529" (UID: "036484a3-ab5b-4bf2-a02c-9fb583904529"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.116685 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.117902 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.137368 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data" (OuterVolumeSpecName: "config-data") pod "036484a3-ab5b-4bf2-a02c-9fb583904529" (UID: "036484a3-ab5b-4bf2-a02c-9fb583904529"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.146860 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqw5z\" (UniqueName: \"kubernetes.io/projected/73119236-88e7-4698-a4a2-5ee8e0c1df03-kube-api-access-gqw5z\") pod \"ceilometer-0\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.195133 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.195169 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036484a3-ab5b-4bf2-a02c-9fb583904529-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.230196 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.683805 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5979c644c5-dckr9" event={"ID":"3aa14443-9372-4f3d-bc34-3463b051b107","Type":"ContainerStarted","Data":"0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f"} Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.685943 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.693665 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-84dddfc884-5ff6c" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.693763 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-84dddfc884-5ff6c" event={"ID":"036484a3-ab5b-4bf2-a02c-9fb583904529","Type":"ContainerDied","Data":"47f10ee3d6e47daa64aa5cd6183750639c41f6dc31a437a36bcf97e4402a1a8a"} Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.693836 4933 scope.go:117] "RemoveContainer" containerID="2b9268acc0ad9dc119c98d5f1d24eac32655dee2c52b944ec29cfacba1dbb5ef" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.721517 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5979c644c5-dckr9" podStartSLOduration=3.721495323 podStartE2EDuration="3.721495323s" podCreationTimestamp="2025-12-02 16:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:08.707368579 +0000 UTC m=+1431.958595282" watchObservedRunningTime="2025-12-02 16:16:08.721495323 +0000 UTC m=+1431.972722026" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.757064 4933 scope.go:117] "RemoveContainer" containerID="0fb953aaa2a273d0f89bdf4e52e7e7f823d467071d652705440301b7fed4cbf9" Dec 02 16:16:08 crc kubenswrapper[4933]: E1202 16:16:08.757369 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5545fbccfb-mqjbx_openstack(4554a65a-0086-4a8d-8d70-57e016ad418e)\"" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.772250 4933 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/heat-api-5558cd5dc7-s4k68" event={"ID":"4caba072-5d91-4126-909a-b4e7ad6167d6","Type":"ContainerStarted","Data":"502fb37e69c4ed4b648add46e46605f39c20b2666f164ad2678b88200437e652"} Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.772386 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.773173 4933 scope.go:117] "RemoveContainer" containerID="826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1" Dec 02 16:16:08 crc kubenswrapper[4933]: E1202 16:16:08.773860 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-57cc8c89b8-9grd9_openstack(b5ea07dd-c4dc-421e-b710-8d759f0d9487)\"" pod="openstack/heat-api-57cc8c89b8-9grd9" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.790216 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-84dddfc884-5ff6c"] Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.837696 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-84dddfc884-5ff6c"] Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.891056 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5558cd5dc7-s4k68" podStartSLOduration=3.891033641 podStartE2EDuration="3.891033641s" podCreationTimestamp="2025-12-02 16:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:08.833262991 +0000 UTC m=+1432.084489704" watchObservedRunningTime="2025-12-02 16:16:08.891033641 +0000 UTC m=+1432.142260344" Dec 02 16:16:08 crc kubenswrapper[4933]: I1202 16:16:08.931123 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:08 crc kubenswrapper[4933]: W1202 16:16:08.933167 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73119236_88e7_4698_a4a2_5ee8e0c1df03.slice/crio-fcb12a4b366f509b85c03d70d592a1a7b50ddacccfa349cc1407bfa2519575f6 WatchSource:0}: Error finding container fcb12a4b366f509b85c03d70d592a1a7b50ddacccfa349cc1407bfa2519575f6: Status 404 returned error can't find the container with id fcb12a4b366f509b85c03d70d592a1a7b50ddacccfa349cc1407bfa2519575f6 Dec 02 16:16:09 crc kubenswrapper[4933]: I1202 16:16:09.081976 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036484a3-ab5b-4bf2-a02c-9fb583904529" path="/var/lib/kubelet/pods/036484a3-ab5b-4bf2-a02c-9fb583904529/volumes" Dec 02 16:16:09 crc kubenswrapper[4933]: I1202 16:16:09.083613 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da32456-86db-40a7-afb5-9d8b6e5866d6" path="/var/lib/kubelet/pods/2da32456-86db-40a7-afb5-9d8b6e5866d6/volumes" Dec 02 16:16:09 crc kubenswrapper[4933]: I1202 16:16:09.084257 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41bc6554-0594-4a0b-9b1c-063f09af19ef" path="/var/lib/kubelet/pods/41bc6554-0594-4a0b-9b1c-063f09af19ef/volumes" Dec 02 16:16:09 crc kubenswrapper[4933]: I1202 16:16:09.812281 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerStarted","Data":"fcb12a4b366f509b85c03d70d592a1a7b50ddacccfa349cc1407bfa2519575f6"} Dec 02 16:16:09 crc kubenswrapper[4933]: I1202 16:16:09.814260 4933 scope.go:117] "RemoveContainer" containerID="0fb953aaa2a273d0f89bdf4e52e7e7f823d467071d652705440301b7fed4cbf9" Dec 02 16:16:09 crc kubenswrapper[4933]: E1202 16:16:09.814777 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5545fbccfb-mqjbx_openstack(4554a65a-0086-4a8d-8d70-57e016ad418e)\"" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" Dec 02 16:16:10 crc kubenswrapper[4933]: I1202 16:16:10.609297 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-zj8w6" podUID="317f6015-0b9d-4a80-b022-2b77224b0284" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: i/o timeout" Dec 02 16:16:11 crc kubenswrapper[4933]: I1202 16:16:11.198668 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:11 crc kubenswrapper[4933]: I1202 16:16:11.200303 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-64f6f8f6c-x5wnf" Dec 02 16:16:11 crc kubenswrapper[4933]: I1202 16:16:11.495751 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:12 crc kubenswrapper[4933]: E1202 16:16:12.241968 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.612260 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h8zrv"] Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.620527 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.633069 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8zrv"] Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.742414 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-utilities\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.742692 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w6bp\" (UniqueName: \"kubernetes.io/projected/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-kube-api-access-8w6bp\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.742773 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-catalog-content\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.846350 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-catalog-content\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.846521 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-utilities\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.846556 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w6bp\" (UniqueName: \"kubernetes.io/projected/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-kube-api-access-8w6bp\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.847255 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-catalog-content\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.847457 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-utilities\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.884706 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8w6bp\" (UniqueName: \"kubernetes.io/projected/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-kube-api-access-8w6bp\") pod \"redhat-operators-h8zrv\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:12 crc kubenswrapper[4933]: I1202 16:16:12.953064 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:13 crc kubenswrapper[4933]: E1202 16:16:13.358940 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:16:14 crc kubenswrapper[4933]: I1202 16:16:14.903685 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:16:15 crc kubenswrapper[4933]: I1202 16:16:15.508757 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:16:15 crc kubenswrapper[4933]: I1202 16:16:15.509774 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-log" containerID="cri-o://37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908" gracePeriod=30 Dec 02 16:16:15 crc kubenswrapper[4933]: I1202 16:16:15.509989 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-httpd" containerID="cri-o://26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107" gracePeriod=30 Dec 02 16:16:15 crc kubenswrapper[4933]: I1202 16:16:15.886018 4933 generic.go:334] "Generic (PLEG): container finished" podID="103581af-5f22-4b11-a0a3-093da3661978" containerID="37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908" exitCode=143 Dec 02 16:16:15 crc kubenswrapper[4933]: I1202 16:16:15.886093 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"103581af-5f22-4b11-a0a3-093da3661978","Type":"ContainerDied","Data":"37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908"} Dec 02 16:16:16 crc kubenswrapper[4933]: I1202 16:16:16.914912 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerStarted","Data":"a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1"} Dec 02 16:16:16 crc kubenswrapper[4933]: I1202 16:16:16.922708 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f673f811-c1d1-4e11-94d8-4932e9761bbf","Type":"ContainerStarted","Data":"150f7dad86bc1774da871c5115009a493074d35c71e575b8e73d9692c42a24e4"} Dec 02 16:16:16 crc kubenswrapper[4933]: I1202 16:16:16.942585 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8zrv"] Dec 02 16:16:16 crc kubenswrapper[4933]: I1202 16:16:16.957056 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.559134715 podStartE2EDuration="24.957038044s" podCreationTimestamp="2025-12-02 16:15:52 +0000 
UTC" firstStartedPulling="2025-12-02 16:15:53.024515533 +0000 UTC m=+1416.275742236" lastFinishedPulling="2025-12-02 16:16:16.422418872 +0000 UTC m=+1439.673645565" observedRunningTime="2025-12-02 16:16:16.954531816 +0000 UTC m=+1440.205758539" watchObservedRunningTime="2025-12-02 16:16:16.957038044 +0000 UTC m=+1440.208264747" Dec 02 16:16:17 crc kubenswrapper[4933]: I1202 16:16:17.938798 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerStarted","Data":"3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165"} Dec 02 16:16:17 crc kubenswrapper[4933]: I1202 16:16:17.940960 4933 generic.go:334] "Generic (PLEG): container finished" podID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerID="7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0" exitCode=0 Dec 02 16:16:17 crc kubenswrapper[4933]: I1202 16:16:17.942448 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerDied","Data":"7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0"} Dec 02 16:16:17 crc kubenswrapper[4933]: I1202 16:16:17.942480 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerStarted","Data":"f309632ffe67b95609fcc28ce15d819e0d1e7d07df0ae6ffd4dfdbf424256b22"} Dec 02 16:16:18 crc kubenswrapper[4933]: I1202 16:16:18.202716 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:16:18 crc kubenswrapper[4933]: I1202 16:16:18.294489 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5545fbccfb-mqjbx"] Dec 02 16:16:18 crc kubenswrapper[4933]: I1202 16:16:18.459979 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:16:18 crc kubenswrapper[4933]: I1202 16:16:18.559020 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-57cc8c89b8-9grd9"] Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.374211 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.384852 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.459785 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-combined-ca-bundle\") pod \"4554a65a-0086-4a8d-8d70-57e016ad418e\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.459892 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-combined-ca-bundle\") pod \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.459941 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqrzs\" (UniqueName: \"kubernetes.io/projected/b5ea07dd-c4dc-421e-b710-8d759f0d9487-kube-api-access-jqrzs\") pod \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.459974 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data\") pod \"4554a65a-0086-4a8d-8d70-57e016ad418e\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.460026 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data\") pod \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.460091 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data-custom\") pod \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\" (UID: \"b5ea07dd-c4dc-421e-b710-8d759f0d9487\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.460180 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data-custom\") pod \"4554a65a-0086-4a8d-8d70-57e016ad418e\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.460203 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbfdm\" (UniqueName: \"kubernetes.io/projected/4554a65a-0086-4a8d-8d70-57e016ad418e-kube-api-access-jbfdm\") pod \"4554a65a-0086-4a8d-8d70-57e016ad418e\" (UID: \"4554a65a-0086-4a8d-8d70-57e016ad418e\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.467457 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4554a65a-0086-4a8d-8d70-57e016ad418e-kube-api-access-jbfdm" (OuterVolumeSpecName: "kube-api-access-jbfdm") pod "4554a65a-0086-4a8d-8d70-57e016ad418e" (UID: "4554a65a-0086-4a8d-8d70-57e016ad418e"). InnerVolumeSpecName "kube-api-access-jbfdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.469864 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4554a65a-0086-4a8d-8d70-57e016ad418e" (UID: "4554a65a-0086-4a8d-8d70-57e016ad418e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.475417 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b5ea07dd-c4dc-421e-b710-8d759f0d9487" (UID: "b5ea07dd-c4dc-421e-b710-8d759f0d9487"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.498080 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ea07dd-c4dc-421e-b710-8d759f0d9487-kube-api-access-jqrzs" (OuterVolumeSpecName: "kube-api-access-jqrzs") pod "b5ea07dd-c4dc-421e-b710-8d759f0d9487" (UID: "b5ea07dd-c4dc-421e-b710-8d759f0d9487"). InnerVolumeSpecName "kube-api-access-jqrzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.562929 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.562991 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.563001 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbfdm\" (UniqueName: \"kubernetes.io/projected/4554a65a-0086-4a8d-8d70-57e016ad418e-kube-api-access-jbfdm\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.563013 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqrzs\" (UniqueName: \"kubernetes.io/projected/b5ea07dd-c4dc-421e-b710-8d759f0d9487-kube-api-access-jqrzs\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.635087 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4554a65a-0086-4a8d-8d70-57e016ad418e" (UID: "4554a65a-0086-4a8d-8d70-57e016ad418e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.665699 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.721205 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.730733 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5ea07dd-c4dc-421e-b710-8d759f0d9487" (UID: "b5ea07dd-c4dc-421e-b710-8d759f0d9487"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.745230 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data" (OuterVolumeSpecName: "config-data") pod "b5ea07dd-c4dc-421e-b710-8d759f0d9487" (UID: "b5ea07dd-c4dc-421e-b710-8d759f0d9487"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767349 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrf27\" (UniqueName: \"kubernetes.io/projected/103581af-5f22-4b11-a0a3-093da3661978-kube-api-access-vrf27\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767605 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-scripts\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767644 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-config-data\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767692 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-httpd-run\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767754 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-logs\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767778 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767896 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-internal-tls-certs\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.767926 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-combined-ca-bundle\") pod \"103581af-5f22-4b11-a0a3-093da3661978\" (UID: \"103581af-5f22-4b11-a0a3-093da3661978\") " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.768579 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.768595 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ea07dd-c4dc-421e-b710-8d759f0d9487-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.769668 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.772618 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103581af-5f22-4b11-a0a3-093da3661978-kube-api-access-vrf27" (OuterVolumeSpecName: "kube-api-access-vrf27") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "kube-api-access-vrf27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.773187 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-logs" (OuterVolumeSpecName: "logs") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.783015 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.793994 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-scripts" (OuterVolumeSpecName: "scripts") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.794802 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data" (OuterVolumeSpecName: "config-data") pod "4554a65a-0086-4a8d-8d70-57e016ad418e" (UID: "4554a65a-0086-4a8d-8d70-57e016ad418e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.840034 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874446 4933 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874481 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103581af-5f22-4b11-a0a3-093da3661978-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874505 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874516 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4554a65a-0086-4a8d-8d70-57e016ad418e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874526 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874536 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrf27\" (UniqueName: \"kubernetes.io/projected/103581af-5f22-4b11-a0a3-093da3661978-kube-api-access-vrf27\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.874544 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.949576 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-config-data" (OuterVolumeSpecName: "config-data") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.959657 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "103581af-5f22-4b11-a0a3-093da3661978" (UID: "103581af-5f22-4b11-a0a3-093da3661978"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.967786 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.976052 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.976081 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.976094 4933 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103581af-5f22-4b11-a0a3-093da3661978-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.978436 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" event={"ID":"4554a65a-0086-4a8d-8d70-57e016ad418e","Type":"ContainerDied","Data":"da059523de4bf7b97bd488a4aed6051e04027f4035cf36d64a28830722c09296"} Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.978485 4933 scope.go:117] "RemoveContainer" containerID="0fb953aaa2a273d0f89bdf4e52e7e7f823d467071d652705440301b7fed4cbf9" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.978570 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5545fbccfb-mqjbx" Dec 02 16:16:19 crc kubenswrapper[4933]: I1202 16:16:19.988949 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerStarted","Data":"b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806"} Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.015518 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57cc8c89b8-9grd9" event={"ID":"b5ea07dd-c4dc-421e-b710-8d759f0d9487","Type":"ContainerDied","Data":"54da929ff1e237b7222e00fbce8def44efbb1aa6460df4a49e3db48c0d1fa804"} Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.015605 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-57cc8c89b8-9grd9" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.023620 4933 generic.go:334] "Generic (PLEG): container finished" podID="103581af-5f22-4b11-a0a3-093da3661978" containerID="26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107" exitCode=0 Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.023669 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"103581af-5f22-4b11-a0a3-093da3661978","Type":"ContainerDied","Data":"26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107"} Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.023697 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"103581af-5f22-4b11-a0a3-093da3661978","Type":"ContainerDied","Data":"b40d0f3200d7ab3da43e2885e75f07a1751e33ceaa9d2399305f86b2135ea2b7"} Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.023760 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.149289 4933 scope.go:117] "RemoveContainer" containerID="826aee863825a409f3215548bc30dc822c450b46483aaa49e0c7b31a17bb87c1" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.223946 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-57cc8c89b8-9grd9"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.236875 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-57cc8c89b8-9grd9"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.248899 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5545fbccfb-mqjbx"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.249583 4933 scope.go:117] "RemoveContainer" containerID="26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.273468 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5545fbccfb-mqjbx"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.286380 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.290269 4933 scope.go:117] "RemoveContainer" containerID="37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.304914 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.329539 4933 scope.go:117] "RemoveContainer" containerID="26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107" Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.330588 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107\": container with ID starting with 26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107 not found: ID does not exist" containerID="26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.330703 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107"} err="failed to get container status \"26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107\": rpc error: code = NotFound desc = could not find container \"26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107\": container with ID starting with 26e822c5be606218010c86b961572d270d34c59c223eab95d2ccc2b9a909c107 not found: ID does not exist" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.330777 4933 scope.go:117] "RemoveContainer" containerID="37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.330917 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.331445 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerName="heat-cfnapi" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.331514 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerName="heat-cfnapi" 
Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.331584 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-httpd" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.331642 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-httpd" Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.331697 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerName="heat-api" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.331797 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerName="heat-api" Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.331939 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-log" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332011 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-log" Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.331772 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908\": container with ID starting with 37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908 not found: ID does not exist" containerID="37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332290 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908"} err="failed to get container status \"37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908\": rpc error: code = NotFound desc = could not find container \"37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908\": container with ID starting with 37212acc77563c3a21486221b9d97152443c021cd2709eef50f3258d2a878908 not found: ID does not exist" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332409 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerName="heat-cfnapi" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332471 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-log" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332551 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerName="heat-api" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332619 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="103581af-5f22-4b11-a0a3-093da3661978" containerName="glance-httpd" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.332681 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerName="heat-api" Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.332985 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerName="heat-cfnapi" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.333054 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerName="heat-cfnapi" Dec 02 16:16:20 crc kubenswrapper[4933]: E1202 16:16:20.333123 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerName="heat-api" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.333241 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" containerName="heat-api" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.333503 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" containerName="heat-cfnapi" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.334366 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.349234 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.349539 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.354849 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.386343 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390144 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc68x\" (UniqueName: \"kubernetes.io/projected/45180c36-0fa3-4abc-a647-9b4beb0ed87d-kube-api-access-xc68x\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390321 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45180c36-0fa3-4abc-a647-9b4beb0ed87d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390416 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45180c36-0fa3-4abc-a647-9b4beb0ed87d-logs\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390510 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390620 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390698 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.390984 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.493764 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494137 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc68x\" (UniqueName: \"kubernetes.io/projected/45180c36-0fa3-4abc-a647-9b4beb0ed87d-kube-api-access-xc68x\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494277 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45180c36-0fa3-4abc-a647-9b4beb0ed87d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494389 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45180c36-0fa3-4abc-a647-9b4beb0ed87d-logs\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494495 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494607 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494713 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494850 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45180c36-0fa3-4abc-a647-9b4beb0ed87d-logs\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.494635 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45180c36-0fa3-4abc-a647-9b4beb0ed87d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.495099 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.495138 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.498931 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.499179 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.499233 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.505485 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45180c36-0fa3-4abc-a647-9b4beb0ed87d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.513183 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc68x\" (UniqueName: \"kubernetes.io/projected/45180c36-0fa3-4abc-a647-9b4beb0ed87d-kube-api-access-xc68x\") pod \"glance-default-internal-api-0\" (UID: 
\"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.530030 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"45180c36-0fa3-4abc-a647-9b4beb0ed87d\") " pod="openstack/glance-default-internal-api-0" Dec 02 16:16:20 crc kubenswrapper[4933]: I1202 16:16:20.656299 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:21 crc kubenswrapper[4933]: I1202 16:16:21.053524 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerStarted","Data":"86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48"} Dec 02 16:16:21 crc kubenswrapper[4933]: I1202 16:16:21.076508 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103581af-5f22-4b11-a0a3-093da3661978" path="/var/lib/kubelet/pods/103581af-5f22-4b11-a0a3-093da3661978/volumes" Dec 02 16:16:21 crc kubenswrapper[4933]: I1202 16:16:21.077124 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4554a65a-0086-4a8d-8d70-57e016ad418e" path="/var/lib/kubelet/pods/4554a65a-0086-4a8d-8d70-57e016ad418e/volumes" Dec 02 16:16:21 crc kubenswrapper[4933]: I1202 16:16:21.077801 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ea07dd-c4dc-421e-b710-8d759f0d9487" path="/var/lib/kubelet/pods/b5ea07dd-c4dc-421e-b710-8d759f0d9487/volumes" Dec 02 16:16:21 crc kubenswrapper[4933]: I1202 16:16:21.405205 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 16:16:21 crc kubenswrapper[4933]: W1202 16:16:21.465463 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45180c36_0fa3_4abc_a647_9b4beb0ed87d.slice/crio-a722befded0f1d82e788ec9456f7fb9b318e48093c3869e1be2dc157e4a9b355 WatchSource:0}: Error finding container a722befded0f1d82e788ec9456f7fb9b318e48093c3869e1be2dc157e4a9b355: Status 404 returned error can't find the container with id a722befded0f1d82e788ec9456f7fb9b318e48093c3869e1be2dc157e4a9b355 Dec 02 16:16:22 crc kubenswrapper[4933]: I1202 16:16:22.120433 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45180c36-0fa3-4abc-a647-9b4beb0ed87d","Type":"ContainerStarted","Data":"a722befded0f1d82e788ec9456f7fb9b318e48093c3869e1be2dc157e4a9b355"} Dec 02 16:16:22 crc kubenswrapper[4933]: I1202 16:16:22.486525 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:16:22 crc kubenswrapper[4933]: I1202 16:16:22.544769 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76479c94dd-9jkgg"] Dec 02 16:16:22 crc kubenswrapper[4933]: I1202 16:16:22.545062 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-76479c94dd-9jkgg" podUID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" containerName="heat-engine" containerID="cri-o://cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c" gracePeriod=60 Dec 02 16:16:23 crc kubenswrapper[4933]: I1202 16:16:23.156205 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"45180c36-0fa3-4abc-a647-9b4beb0ed87d","Type":"ContainerStarted","Data":"fa4048e310b154647d0a0938882ae6d894737ae5819ed836ce36b1acbc8b2bac"} Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.166939 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45180c36-0fa3-4abc-a647-9b4beb0ed87d","Type":"ContainerStarted","Data":"4de375ff19abb2b736e5c1e3b72f1b73c340a82fa6d09fb224b13ad4d38d67d5"} Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.170050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerDied","Data":"86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48"} Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.169987 4933 generic.go:334] "Generic (PLEG): container finished" podID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerID="86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48" exitCode=0 Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.174288 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerStarted","Data":"61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07"} Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.174458 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-central-agent" containerID="cri-o://a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1" gracePeriod=30 Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.174507 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.174535 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="proxy-httpd" containerID="cri-o://61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07" gracePeriod=30 Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.174571 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="sg-core" containerID="cri-o://b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806" gracePeriod=30 Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.174605 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-notification-agent" containerID="cri-o://3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165" gracePeriod=30 Dec 02 16:16:24 crc kubenswrapper[4933]: I1202 16:16:24.209645 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.209624747 podStartE2EDuration="4.209624747s" podCreationTimestamp="2025-12-02 16:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:24.203968773 +0000 UTC m=+1447.455195476" watchObservedRunningTime="2025-12-02 16:16:24.209624747 +0000 UTC m=+1447.460851450" Dec 02 16:16:24 crc 
kubenswrapper[4933]: I1202 16:16:24.232001 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.214165866 podStartE2EDuration="17.231984504s" podCreationTimestamp="2025-12-02 16:16:07 +0000 UTC" firstStartedPulling="2025-12-02 16:16:08.93585889 +0000 UTC m=+1432.187085593" lastFinishedPulling="2025-12-02 16:16:22.953677528 +0000 UTC m=+1446.204904231" observedRunningTime="2025-12-02 16:16:24.230544045 +0000 UTC m=+1447.481770738" watchObservedRunningTime="2025-12-02 16:16:24.231984504 +0000 UTC m=+1447.483211197" Dec 02 16:16:24 crc kubenswrapper[4933]: E1202 16:16:24.865343 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 16:16:24 crc kubenswrapper[4933]: E1202 16:16:24.866698 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 16:16:24 crc kubenswrapper[4933]: E1202 16:16:24.867910 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 16:16:24 crc kubenswrapper[4933]: E1202 16:16:24.867942 4933 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-76479c94dd-9jkgg" podUID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" containerName="heat-engine" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.197362 4933 generic.go:334] "Generic (PLEG): container finished" podID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerID="61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07" exitCode=0 Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.197668 4933 generic.go:334] "Generic (PLEG): container finished" podID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerID="b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806" exitCode=2 Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.197679 4933 generic.go:334] "Generic (PLEG): container finished" podID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerID="3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165" exitCode=0 Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.197741 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerDied","Data":"61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07"} Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.197773 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerDied","Data":"b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806"} Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 
16:16:25.197787 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerDied","Data":"3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165"} Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.205940 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerStarted","Data":"f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6"} Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.242745 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h8zrv" podStartSLOduration=6.414685624 podStartE2EDuration="13.242723147s" podCreationTimestamp="2025-12-02 16:16:12 +0000 UTC" firstStartedPulling="2025-12-02 16:16:17.944019821 +0000 UTC m=+1441.195246524" lastFinishedPulling="2025-12-02 16:16:24.772057344 +0000 UTC m=+1448.023284047" observedRunningTime="2025-12-02 16:16:25.224487652 +0000 UTC m=+1448.475714375" watchObservedRunningTime="2025-12-02 16:16:25.242723147 +0000 UTC m=+1448.493949850" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.930617 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950271 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-config-data\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950332 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-combined-ca-bundle\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950537 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-sg-core-conf-yaml\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950577 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-run-httpd\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950628 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-scripts\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950711 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqw5z\" (UniqueName: \"kubernetes.io/projected/73119236-88e7-4698-a4a2-5ee8e0c1df03-kube-api-access-gqw5z\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.950860 4933 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-log-httpd\") pod \"73119236-88e7-4698-a4a2-5ee8e0c1df03\" (UID: \"73119236-88e7-4698-a4a2-5ee8e0c1df03\") " Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.951297 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.951633 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.952194 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.952226 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73119236-88e7-4698-a4a2-5ee8e0c1df03-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.961452 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-scripts" (OuterVolumeSpecName: "scripts") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:25 crc kubenswrapper[4933]: I1202 16:16:25.975403 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73119236-88e7-4698-a4a2-5ee8e0c1df03-kube-api-access-gqw5z" (OuterVolumeSpecName: "kube-api-access-gqw5z") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "kube-api-access-gqw5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.053502 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.053792 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqw5z\" (UniqueName: \"kubernetes.io/projected/73119236-88e7-4698-a4a2-5ee8e0c1df03-kube-api-access-gqw5z\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.069997 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.119987 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.135974 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-config-data" (OuterVolumeSpecName: "config-data") pod "73119236-88e7-4698-a4a2-5ee8e0c1df03" (UID: "73119236-88e7-4698-a4a2-5ee8e0c1df03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.155630 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.155665 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.155678 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73119236-88e7-4698-a4a2-5ee8e0c1df03-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.219344 4933 generic.go:334] "Generic (PLEG): container finished" podID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerID="a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1" exitCode=0 Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.219391 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerDied","Data":"a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1"} Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.219422 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73119236-88e7-4698-a4a2-5ee8e0c1df03","Type":"ContainerDied","Data":"fcb12a4b366f509b85c03d70d592a1a7b50ddacccfa349cc1407bfa2519575f6"} Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.219441 4933 scope.go:117] "RemoveContainer" containerID="61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.219620 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.250483 4933 scope.go:117] "RemoveContainer" containerID="b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.260790 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.280675 4933 scope.go:117] "RemoveContainer" containerID="3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.282565 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307086 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.307621 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="sg-core" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307641 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="sg-core" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.307657 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="proxy-httpd" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307664 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="proxy-httpd" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.307687 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-central-agent" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307694 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-central-agent" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.307721 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-notification-agent" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307727 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-notification-agent" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307954 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-notification-agent" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307981 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="proxy-httpd" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.307994 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="ceilometer-central-agent" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.308006 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" containerName="sg-core" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.310199 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.315029 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.315073 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.337525 4933 scope.go:117] "RemoveContainer" containerID="a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.348604 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.359969 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.360081 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-config-data\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.360177 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-run-httpd\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.360215 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkbmm\" (UniqueName: \"kubernetes.io/projected/916e3919-8d7f-4678-b211-7b06c3404ded-kube-api-access-zkbmm\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.360247 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-log-httpd\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.360384 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-scripts\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.360462 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.429015 4933 scope.go:117] "RemoveContainer" containerID="61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 
16:16:26.432975 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07\": container with ID starting with 61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07 not found: ID does not exist" containerID="61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.433035 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07"} err="failed to get container status \"61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07\": rpc error: code = NotFound desc = could not find container \"61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07\": container with ID starting with 61791fdd1f8c4bad9e6dc4af89469d0b055eb6c6c306999b10f238d536cb8b07 not found: ID does not exist" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.433069 4933 scope.go:117] "RemoveContainer" containerID="b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.445974 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806\": container with ID starting with b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806 not found: ID does not exist" containerID="b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.446031 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806"} err="failed to get container status \"b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806\": rpc error: code = NotFound desc = could not find container \"b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806\": container with ID starting with b9637ba49afd541057409b63dc1da065b8bcc58906876a7bfc0cd72bcb383806 not found: ID does not exist" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.446064 4933 scope.go:117] "RemoveContainer" containerID="3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.450985 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165\": container with ID starting with 3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165 not found: ID does not exist" containerID="3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.451054 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165"} err="failed to get container status \"3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165\": rpc error: code = NotFound desc = could not find container \"3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165\": container with ID starting with 3c5ff5c103092ea63036d4eb06e5e5e43dd10eeb3dcd959a721a4a396bd3a165 not found: ID does not exist" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.451085 4933 
scope.go:117] "RemoveContainer" containerID="a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1" Dec 02 16:16:26 crc kubenswrapper[4933]: E1202 16:16:26.452937 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1\": container with ID starting with a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1 not found: ID does not exist" containerID="a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.453010 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1"} err="failed to get container status \"a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1\": rpc error: code = NotFound desc = could not find container \"a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1\": container with ID starting with a1662fe208ee4f6a63768b5d8e7fb70bd4c5727b46c305e4155194d1686078e1 not found: ID does not exist" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.465083 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.465391 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-config-data\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.465555 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-run-httpd\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.465975 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkbmm\" (UniqueName: \"kubernetes.io/projected/916e3919-8d7f-4678-b211-7b06c3404ded-kube-api-access-zkbmm\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.466088 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-log-httpd\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.466178 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-scripts\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.466314 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.467364 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-log-httpd\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.467679 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-run-httpd\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.472225 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.472668 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.481090 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-config-data\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.490629 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-scripts\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.505812 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkbmm\" (UniqueName: \"kubernetes.io/projected/916e3919-8d7f-4678-b211-7b06c3404ded-kube-api-access-zkbmm\") pod \"ceilometer-0\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " pod="openstack/ceilometer-0" Dec 02 16:16:26 crc kubenswrapper[4933]: I1202 16:16:26.638456 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:27 crc kubenswrapper[4933]: I1202 16:16:27.069079 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73119236-88e7-4698-a4a2-5ee8e0c1df03" path="/var/lib/kubelet/pods/73119236-88e7-4698-a4a2-5ee8e0c1df03/volumes" Dec 02 16:16:27 crc kubenswrapper[4933]: I1202 16:16:27.206416 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:27 crc kubenswrapper[4933]: W1202 16:16:27.211158 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod916e3919_8d7f_4678_b211_7b06c3404ded.slice/crio-4f0a78d4815323123deb4bc47ef5c9b07bb1ba9e25fdfc9a068f7836fbc70eb9 WatchSource:0}: Error finding container 4f0a78d4815323123deb4bc47ef5c9b07bb1ba9e25fdfc9a068f7836fbc70eb9: Status 404 returned error can't find the container with id 4f0a78d4815323123deb4bc47ef5c9b07bb1ba9e25fdfc9a068f7836fbc70eb9 Dec 02 16:16:27 crc kubenswrapper[4933]: I1202 16:16:27.229981 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerStarted","Data":"4f0a78d4815323123deb4bc47ef5c9b07bb1ba9e25fdfc9a068f7836fbc70eb9"} Dec 02 16:16:28 crc kubenswrapper[4933]: I1202 16:16:28.183271 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:16:28 crc kubenswrapper[4933]: I1202 16:16:28.183699 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-log" containerID="cri-o://f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0" gracePeriod=30 Dec 02 16:16:28 crc kubenswrapper[4933]: I1202 16:16:28.184186 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-httpd" containerID="cri-o://5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f" gracePeriod=30 Dec 02 16:16:29 crc kubenswrapper[4933]: I1202 16:16:29.258882 4933 generic.go:334] "Generic (PLEG): container finished" podID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerID="f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0" exitCode=143 Dec 02 16:16:29 crc kubenswrapper[4933]: I1202 16:16:29.258978 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc64d10-680e-40b1-816f-9d63f8eb8b11","Type":"ContainerDied","Data":"f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0"} Dec 02 16:16:29 crc kubenswrapper[4933]: I1202 16:16:29.268556 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerStarted","Data":"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb"} Dec 02 16:16:30 crc kubenswrapper[4933]: I1202 16:16:30.280656 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerStarted","Data":"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f"} Dec 02 16:16:30 crc kubenswrapper[4933]: I1202 16:16:30.280999 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerStarted","Data":"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4"} Dec 02 16:16:30 crc kubenswrapper[4933]: I1202 16:16:30.658032 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:30 crc kubenswrapper[4933]: I1202 16:16:30.658389 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:30 crc kubenswrapper[4933]: I1202 16:16:30.719396 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:30 crc kubenswrapper[4933]: I1202 16:16:30.746218 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.298609 4933 generic.go:334] "Generic (PLEG): container finished" podID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" containerID="cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c" exitCode=0 Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.298692 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76479c94dd-9jkgg" event={"ID":"8aebfa1d-65c0-4f16-bcee-86e03d923f99","Type":"ContainerDied","Data":"cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c"} Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.298907 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76479c94dd-9jkgg" event={"ID":"8aebfa1d-65c0-4f16-bcee-86e03d923f99","Type":"ContainerDied","Data":"407e37b3be8333c149e6ce9c198dccd919da281b9496beb91a98b634e64c96d4"} Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.298934 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="407e37b3be8333c149e6ce9c198dccd919da281b9496beb91a98b634e64c96d4" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.300648 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.300681 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.420623 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.500950 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data\") pod \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.501071 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-combined-ca-bundle\") pod \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.501098 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data-custom\") pod \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.501132 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw47r\" (UniqueName: \"kubernetes.io/projected/8aebfa1d-65c0-4f16-bcee-86e03d923f99-kube-api-access-rw47r\") pod \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\" (UID: \"8aebfa1d-65c0-4f16-bcee-86e03d923f99\") " Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.516302 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aebfa1d-65c0-4f16-bcee-86e03d923f99-kube-api-access-rw47r" (OuterVolumeSpecName: "kube-api-access-rw47r") pod "8aebfa1d-65c0-4f16-bcee-86e03d923f99" (UID: "8aebfa1d-65c0-4f16-bcee-86e03d923f99"). InnerVolumeSpecName "kube-api-access-rw47r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.516438 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8aebfa1d-65c0-4f16-bcee-86e03d923f99" (UID: "8aebfa1d-65c0-4f16-bcee-86e03d923f99"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.565352 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aebfa1d-65c0-4f16-bcee-86e03d923f99" (UID: "8aebfa1d-65c0-4f16-bcee-86e03d923f99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.604553 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.604587 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.604602 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw47r\" (UniqueName: \"kubernetes.io/projected/8aebfa1d-65c0-4f16-bcee-86e03d923f99-kube-api-access-rw47r\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.605038 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data" (OuterVolumeSpecName: "config-data") pod "8aebfa1d-65c0-4f16-bcee-86e03d923f99" (UID: "8aebfa1d-65c0-4f16-bcee-86e03d923f99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.707151 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aebfa1d-65c0-4f16-bcee-86e03d923f99-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:31 crc kubenswrapper[4933]: I1202 16:16:31.992941 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.119974 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120051 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-config-data\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120071 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-public-tls-certs\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120117 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg29m\" (UniqueName: \"kubernetes.io/projected/fbc64d10-680e-40b1-816f-9d63f8eb8b11-kube-api-access-qg29m\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120165 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-logs\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120217 4933 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-combined-ca-bundle\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120300 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-scripts\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.120362 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-httpd-run\") pod \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\" (UID: \"fbc64d10-680e-40b1-816f-9d63f8eb8b11\") " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.142244 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-logs" (OuterVolumeSpecName: "logs") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.142472 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.143011 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-scripts" (OuterVolumeSpecName: "scripts") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.149463 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.156045 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc64d10-680e-40b1-816f-9d63f8eb8b11-kube-api-access-qg29m" (OuterVolumeSpecName: "kube-api-access-qg29m") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "kube-api-access-qg29m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.174977 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.229338 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.231174 4933 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.231301 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.231363 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg29m\" (UniqueName: \"kubernetes.io/projected/fbc64d10-680e-40b1-816f-9d63f8eb8b11-kube-api-access-qg29m\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.231428 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc64d10-680e-40b1-816f-9d63f8eb8b11-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.231491 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.239058 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.240048 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-config-data" (OuterVolumeSpecName: "config-data") pod "fbc64d10-680e-40b1-816f-9d63f8eb8b11" (UID: "fbc64d10-680e-40b1-816f-9d63f8eb8b11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.257441 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.316154 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerStarted","Data":"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41"} Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.318007 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.321372 4933 generic.go:334] "Generic (PLEG): container finished" podID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerID="5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f" exitCode=0 Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.323501 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.323797 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc64d10-680e-40b1-816f-9d63f8eb8b11","Type":"ContainerDied","Data":"5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f"} Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.323840 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc64d10-680e-40b1-816f-9d63f8eb8b11","Type":"ContainerDied","Data":"6b4c8ebc1d1fb92cda87594582e1b13c1ac130b328b42367044b0a9856378a38"} Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.323856 4933 scope.go:117] "RemoveContainer" containerID="5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.324018 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76479c94dd-9jkgg" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.335762 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.336207 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.343887 4933 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc64d10-680e-40b1-816f-9d63f8eb8b11-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.346337 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.038994132 podStartE2EDuration="6.34631259s" podCreationTimestamp="2025-12-02 16:16:26 +0000 UTC" firstStartedPulling="2025-12-02 16:16:27.214660106 +0000 UTC m=+1450.465886799" lastFinishedPulling="2025-12-02 16:16:31.521978554 +0000 UTC m=+1454.773205257" observedRunningTime="2025-12-02 16:16:32.336800602 +0000 UTC m=+1455.588027315" watchObservedRunningTime="2025-12-02 16:16:32.34631259 +0000 UTC m=+1455.597539293" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.384651 4933 scope.go:117] "RemoveContainer" containerID="f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.400016 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76479c94dd-9jkgg"] Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.433137 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-76479c94dd-9jkgg"] Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.443125 4933 scope.go:117] "RemoveContainer" containerID="5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f" Dec 02 16:16:32 crc kubenswrapper[4933]: E1202 16:16:32.445011 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f\": container with ID starting with 5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f not found: ID does not exist" 
containerID="5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.445171 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f"} err="failed to get container status \"5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f\": rpc error: code = NotFound desc = could not find container \"5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f\": container with ID starting with 5c5aa0435eded8b97e626868c198bbadc6dab2d39b392f902f20122a2205444f not found: ID does not exist" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.445266 4933 scope.go:117] "RemoveContainer" containerID="f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0" Dec 02 16:16:32 crc kubenswrapper[4933]: E1202 16:16:32.446753 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0\": container with ID starting with f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0 not found: ID does not exist" containerID="f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.446786 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0"} err="failed to get container status \"f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0\": rpc error: code = NotFound desc = could not find container \"f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0\": container with ID starting with f856f18354a9d19cec7e8aeecbe0b1a6429c990950c7e4e13fa0710d8137cfc0 not found: ID does not exist" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.470929 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.488015 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.503104 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:16:32 crc kubenswrapper[4933]: E1202 16:16:32.503689 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-log" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.503707 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-log" Dec 02 16:16:32 crc kubenswrapper[4933]: E1202 16:16:32.503727 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" containerName="heat-engine" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.503735 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" containerName="heat-engine" Dec 02 16:16:32 crc kubenswrapper[4933]: E1202 16:16:32.503768 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-httpd" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.503777 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" 
containerName="glance-httpd" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.504037 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-log" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.504055 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" containerName="heat-engine" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.504071 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" containerName="glance-httpd" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.505381 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.508424 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.513082 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.532898 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549230 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jmgd\" (UniqueName: \"kubernetes.io/projected/848e9b92-1e28-4d82-b057-0335915a6155-kube-api-access-5jmgd\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549318 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-config-data\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549345 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549365 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549406 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-scripts\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549435 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549464 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848e9b92-1e28-4d82-b057-0335915a6155-logs\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.549872 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/848e9b92-1e28-4d82-b057-0335915a6155-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.651895 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848e9b92-1e28-4d82-b057-0335915a6155-logs\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652028 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/848e9b92-1e28-4d82-b057-0335915a6155-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652078 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jmgd\" (UniqueName: \"kubernetes.io/projected/848e9b92-1e28-4d82-b057-0335915a6155-kube-api-access-5jmgd\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652134 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-config-data\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652157 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652179 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652221 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-scripts\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652247 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.652558 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.653381 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848e9b92-1e28-4d82-b057-0335915a6155-logs\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.653923 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/848e9b92-1e28-4d82-b057-0335915a6155-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.662573 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-config-data\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.662897 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.664466 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-scripts\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.664997 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/848e9b92-1e28-4d82-b057-0335915a6155-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.688484 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jmgd\" (UniqueName: \"kubernetes.io/projected/848e9b92-1e28-4d82-b057-0335915a6155-kube-api-access-5jmgd\") pod \"glance-default-external-api-0\" (UID: 
\"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.691768 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"848e9b92-1e28-4d82-b057-0335915a6155\") " pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.842480 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.953230 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:32 crc kubenswrapper[4933]: I1202 16:16:32.954067 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:33 crc kubenswrapper[4933]: I1202 16:16:33.088475 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aebfa1d-65c0-4f16-bcee-86e03d923f99" path="/var/lib/kubelet/pods/8aebfa1d-65c0-4f16-bcee-86e03d923f99/volumes" Dec 02 16:16:33 crc kubenswrapper[4933]: I1202 16:16:33.089493 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc64d10-680e-40b1-816f-9d63f8eb8b11" path="/var/lib/kubelet/pods/fbc64d10-680e-40b1-816f-9d63f8eb8b11/volumes" Dec 02 16:16:33 crc kubenswrapper[4933]: W1202 16:16:33.566669 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848e9b92_1e28_4d82_b057_0335915a6155.slice/crio-b8c0b4b466dbf1bba0360429632382b49db890e05e226d7d7bca4bf4fa17a0ec WatchSource:0}: Error finding container b8c0b4b466dbf1bba0360429632382b49db890e05e226d7d7bca4bf4fa17a0ec: Status 404 returned error can't find the container with id b8c0b4b466dbf1bba0360429632382b49db890e05e226d7d7bca4bf4fa17a0ec Dec 02 16:16:33 crc kubenswrapper[4933]: I1202 16:16:33.574684 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 16:16:34 crc kubenswrapper[4933]: I1202 16:16:34.079599 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h8zrv" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="registry-server" probeResult="failure" output=< Dec 02 16:16:34 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 16:16:34 crc kubenswrapper[4933]: > Dec 02 16:16:34 crc kubenswrapper[4933]: I1202 16:16:34.355140 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"848e9b92-1e28-4d82-b057-0335915a6155","Type":"ContainerStarted","Data":"ae3be14519914101e370cfcab63e049dd226167b73af97c024367b83da2c882b"} Dec 02 16:16:34 crc kubenswrapper[4933]: I1202 16:16:34.355513 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"848e9b92-1e28-4d82-b057-0335915a6155","Type":"ContainerStarted","Data":"b8c0b4b466dbf1bba0360429632382b49db890e05e226d7d7bca4bf4fa17a0ec"} Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.002863 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.002973 4933 prober_manager.go:312] "Failed to trigger 
a manual run" probe="Readiness" Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.006034 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.165452 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.171227 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-central-agent" containerID="cri-o://65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" gracePeriod=30 Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.171506 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="proxy-httpd" containerID="cri-o://53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" gracePeriod=30 Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.171664 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-notification-agent" containerID="cri-o://90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" gracePeriod=30 Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.171793 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="sg-core" containerID="cri-o://a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" gracePeriod=30 Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.371422 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"848e9b92-1e28-4d82-b057-0335915a6155","Type":"ContainerStarted","Data":"bde0b19bc8d0946fd73e15ec842d41da89324b8fced651027601a6803c02d7ca"} Dec 02 16:16:35 crc kubenswrapper[4933]: I1202 16:16:35.394367 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.394342079 podStartE2EDuration="3.394342079s" podCreationTimestamp="2025-12-02 16:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:35.390269038 +0000 UTC m=+1458.641495741" watchObservedRunningTime="2025-12-02 16:16:35.394342079 +0000 UTC m=+1458.645568792" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.298528 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.409164 4933 generic.go:334] "Generic (PLEG): container finished" podID="916e3919-8d7f-4678-b211-7b06c3404ded" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" exitCode=0 Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.409507 4933 generic.go:334] "Generic (PLEG): container finished" podID="916e3919-8d7f-4678-b211-7b06c3404ded" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" exitCode=2 Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.409519 4933 generic.go:334] "Generic (PLEG): container finished" podID="916e3919-8d7f-4678-b211-7b06c3404ded" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" exitCode=0 Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.409528 4933 generic.go:334] "Generic (PLEG): container finished" podID="916e3919-8d7f-4678-b211-7b06c3404ded" containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" exitCode=0 Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.409246 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.409270 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerDied","Data":"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41"} Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.410803 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerDied","Data":"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f"} Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.410839 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerDied","Data":"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4"} Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.410854 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerDied","Data":"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb"} Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.410865 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"916e3919-8d7f-4678-b211-7b06c3404ded","Type":"ContainerDied","Data":"4f0a78d4815323123deb4bc47ef5c9b07bb1ba9e25fdfc9a068f7836fbc70eb9"} Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.410881 4933 scope.go:117] "RemoveContainer" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.456947 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-scripts\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457048 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-config-data\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: 
\"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457113 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-log-httpd\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457173 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-combined-ca-bundle\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457197 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-run-httpd\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457223 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkbmm\" (UniqueName: \"kubernetes.io/projected/916e3919-8d7f-4678-b211-7b06c3404ded-kube-api-access-zkbmm\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457318 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-sg-core-conf-yaml\") pod \"916e3919-8d7f-4678-b211-7b06c3404ded\" (UID: \"916e3919-8d7f-4678-b211-7b06c3404ded\") " Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.457903 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.463182 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.469082 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-scripts" (OuterVolumeSpecName: "scripts") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.473342 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916e3919-8d7f-4678-b211-7b06c3404ded-kube-api-access-zkbmm" (OuterVolumeSpecName: "kube-api-access-zkbmm") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "kube-api-access-zkbmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.496017 4933 scope.go:117] "RemoveContainer" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.560932 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.560975 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.560988 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/916e3919-8d7f-4678-b211-7b06c3404ded-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.560999 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkbmm\" (UniqueName: \"kubernetes.io/projected/916e3919-8d7f-4678-b211-7b06c3404ded-kube-api-access-zkbmm\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.568946 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.654174 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.663357 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.667913 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.703530 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-config-data" (OuterVolumeSpecName: "config-data") pod "916e3919-8d7f-4678-b211-7b06c3404ded" (UID: "916e3919-8d7f-4678-b211-7b06c3404ded"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.770386 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916e3919-8d7f-4678-b211-7b06c3404ded-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.782507 4933 scope.go:117] "RemoveContainer" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.789806 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.801672 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.821160 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.822403 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-central-agent" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.822699 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-central-agent" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.822802 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-notification-agent" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.822909 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-notification-agent" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.823013 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="proxy-httpd" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.823958 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="proxy-httpd" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.824107 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="sg-core" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.824183 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="sg-core" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.824524 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-central-agent" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.824593 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="ceilometer-notification-agent" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.824662 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="sg-core" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.824729 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" containerName="proxy-httpd" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.827211 4933 scope.go:117] "RemoveContainer" containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" Dec 02 
16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.829178 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.831941 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.832261 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.836876 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.872271 4933 scope.go:117] "RemoveContainer" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.872749 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": container with ID starting with 53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41 not found: ID does not exist" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.872788 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41"} err="failed to get container status \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": rpc error: code = NotFound desc = could not find container \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": container with ID starting with 53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.872816 4933 scope.go:117] "RemoveContainer" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.873010 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": container with ID starting with a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f not found: ID does not exist" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.873032 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f"} err="failed to get container status \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": rpc error: code = NotFound desc = could not find container \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": container with ID starting with a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.873073 4933 scope.go:117] "RemoveContainer" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.873483 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": 
container with ID starting with 90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4 not found: ID does not exist" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.873511 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4"} err="failed to get container status \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": rpc error: code = NotFound desc = could not find container \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": container with ID starting with 90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.873523 4933 scope.go:117] "RemoveContainer" containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" Dec 02 16:16:36 crc kubenswrapper[4933]: E1202 16:16:36.874165 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": container with ID starting with 65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb not found: ID does not exist" containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.874189 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb"} err="failed to get container status \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": rpc error: code = NotFound desc = could not find container \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": container with ID starting with 65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.874207 4933 scope.go:117] "RemoveContainer" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.874511 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41"} err="failed to get container status \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": rpc error: code = NotFound desc = could not find container \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": container with ID starting with 53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.874557 4933 scope.go:117] "RemoveContainer" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.874849 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f"} err="failed to get container status \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": rpc error: code = NotFound desc = could not find container \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": container with ID starting with a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f not 
found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.874870 4933 scope.go:117] "RemoveContainer" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875107 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4"} err="failed to get container status \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": rpc error: code = NotFound desc = could not find container \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": container with ID starting with 90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875128 4933 scope.go:117] "RemoveContainer" containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875390 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb"} err="failed to get container status \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": rpc error: code = NotFound desc = could not find container \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": container with ID starting with 65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875409 4933 scope.go:117] "RemoveContainer" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875650 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41"} err="failed to get container status \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": rpc error: code = NotFound desc = could not find container \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": container with ID starting with 53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875675 4933 scope.go:117] "RemoveContainer" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875945 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f"} err="failed to get container status \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": rpc error: code = NotFound desc = could not find container \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": container with ID starting with a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.875962 4933 scope.go:117] "RemoveContainer" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.876160 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4"} err="failed to get container 
status \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": rpc error: code = NotFound desc = could not find container \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": container with ID starting with 90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.876176 4933 scope.go:117] "RemoveContainer" containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.876377 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb"} err="failed to get container status \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": rpc error: code = NotFound desc = could not find container \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": container with ID starting with 65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.876396 4933 scope.go:117] "RemoveContainer" containerID="53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.877437 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41"} err="failed to get container status \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": rpc error: code = NotFound desc = could not find container \"53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41\": container with ID starting with 53b39f4f892493cbe28530b790b415ab9a051a5b19348689275e1dcdc3cf6f41 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.877459 4933 scope.go:117] "RemoveContainer" containerID="a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.877688 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f"} err="failed to get container status \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": rpc error: code = NotFound desc = could not find container \"a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f\": container with ID starting with a1f68e470c7f4e0eb4b1777cc6a6ad499b3da706270d9b344d21569cdd4f874f not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.877714 4933 scope.go:117] "RemoveContainer" containerID="90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.877981 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4"} err="failed to get container status \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": rpc error: code = NotFound desc = could not find container \"90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4\": container with ID starting with 90307c26b1e9ed1dd49b9c7aa986de96e413ad48c8b10f4ace6e9ca61d3241c4 not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.878001 4933 scope.go:117] "RemoveContainer" 
containerID="65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.878611 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb"} err="failed to get container status \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": rpc error: code = NotFound desc = could not find container \"65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb\": container with ID starting with 65b2c79311b11ac505dd309247daa9e6ebaaa85b1b9e721d26ebb282dc34d8bb not found: ID does not exist" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975305 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975348 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975394 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975501 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb48x\" (UniqueName: \"kubernetes.io/projected/1360dd8b-76ca-4595-a12d-f8f5309b06cd-kube-api-access-hb48x\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975593 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975649 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-config-data\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:36 crc kubenswrapper[4933]: I1202 16:16:36.975690 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-scripts\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.065977 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916e3919-8d7f-4678-b211-7b06c3404ded" path="/var/lib/kubelet/pods/916e3919-8d7f-4678-b211-7b06c3404ded/volumes" Dec 02 16:16:37 crc 
kubenswrapper[4933]: I1202 16:16:37.077789 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb48x\" (UniqueName: \"kubernetes.io/projected/1360dd8b-76ca-4595-a12d-f8f5309b06cd-kube-api-access-hb48x\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.077914 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.077944 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-config-data\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.077973 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-scripts\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.078073 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.078089 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.078121 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.078632 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.078805 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.082685 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.082970 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-config-data\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.083081 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.083246 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-scripts\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.094511 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb48x\" (UniqueName: \"kubernetes.io/projected/1360dd8b-76ca-4595-a12d-f8f5309b06cd-kube-api-access-hb48x\") pod \"ceilometer-0\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.165766 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:37 crc kubenswrapper[4933]: I1202 16:16:37.687997 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:38 crc kubenswrapper[4933]: I1202 16:16:38.215333 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:38 crc kubenswrapper[4933]: I1202 16:16:38.436493 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerStarted","Data":"c240c43603a99e0451c0d890e9d13534c18d7107119e0eda4dd176e26a6728ec"} Dec 02 16:16:39 crc kubenswrapper[4933]: I1202 16:16:39.448525 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerStarted","Data":"c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a"} Dec 02 16:16:41 crc kubenswrapper[4933]: I1202 16:16:41.469117 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerStarted","Data":"5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071"} Dec 02 16:16:42 crc kubenswrapper[4933]: I1202 16:16:42.843001 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 16:16:42 crc kubenswrapper[4933]: I1202 16:16:42.843353 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 16:16:42 crc kubenswrapper[4933]: I1202 16:16:42.889061 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 16:16:42 crc kubenswrapper[4933]: I1202 16:16:42.906691 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 16:16:42 crc kubenswrapper[4933]: I1202 16:16:42.999695 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:43 crc kubenswrapper[4933]: I1202 16:16:43.066728 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:43 crc kubenswrapper[4933]: I1202 16:16:43.495836 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 16:16:43 crc kubenswrapper[4933]: I1202 16:16:43.496241 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 16:16:43 crc kubenswrapper[4933]: I1202 16:16:43.802439 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8zrv"] Dec 02 16:16:44 crc kubenswrapper[4933]: I1202 16:16:44.550422 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h8zrv" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="registry-server" containerID="cri-o://f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6" gracePeriod=2 Dec 02 16:16:44 crc kubenswrapper[4933]: I1202 16:16:44.550956 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerStarted","Data":"a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a"} Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.235034 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.291613 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-utilities\") pod \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.291883 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-catalog-content\") pod \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.291912 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w6bp\" (UniqueName: \"kubernetes.io/projected/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-kube-api-access-8w6bp\") pod \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\" (UID: \"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27\") " Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.292523 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-utilities" (OuterVolumeSpecName: "utilities") pod "aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" (UID: "aa2e200b-ec52-4e46-bc66-c40d8b3c0c27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.309093 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-kube-api-access-8w6bp" (OuterVolumeSpecName: "kube-api-access-8w6bp") pod "aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" (UID: "aa2e200b-ec52-4e46-bc66-c40d8b3c0c27"). InnerVolumeSpecName "kube-api-access-8w6bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.401484 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.401532 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w6bp\" (UniqueName: \"kubernetes.io/projected/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-kube-api-access-8w6bp\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.435469 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" (UID: "aa2e200b-ec52-4e46-bc66-c40d8b3c0c27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.507569 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.562699 4933 generic.go:334] "Generic (PLEG): container finished" podID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerID="f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6" exitCode=0 Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.562769 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8zrv" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.562787 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerDied","Data":"f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6"} Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.562880 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8zrv" event={"ID":"aa2e200b-ec52-4e46-bc66-c40d8b3c0c27","Type":"ContainerDied","Data":"f309632ffe67b95609fcc28ce15d819e0d1e7d07df0ae6ffd4dfdbf424256b22"} Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.562900 4933 scope.go:117] "RemoveContainer" containerID="f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.570769 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerStarted","Data":"e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113"} Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.570963 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-central-agent" containerID="cri-o://c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a" gracePeriod=30 Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.571196 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.571324 4933 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="proxy-httpd" containerID="cri-o://e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113" gracePeriod=30 Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.571344 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-notification-agent" containerID="cri-o://5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071" gracePeriod=30 Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.571459 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="sg-core" containerID="cri-o://a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a" gracePeriod=30 Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.590349 4933 scope.go:117] "RemoveContainer" containerID="86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.606575 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.525425435 podStartE2EDuration="9.606546847s" podCreationTimestamp="2025-12-02 16:16:36 +0000 UTC" firstStartedPulling="2025-12-02 16:16:37.694114759 +0000 UTC m=+1460.945341472" lastFinishedPulling="2025-12-02 16:16:44.775236181 +0000 UTC m=+1468.026462884" observedRunningTime="2025-12-02 16:16:45.595864587 +0000 UTC m=+1468.847091290" watchObservedRunningTime="2025-12-02 16:16:45.606546847 +0000 UTC m=+1468.857773560" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.625466 4933 scope.go:117] "RemoveContainer" containerID="7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.635627 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8zrv"] Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.645592 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h8zrv"] Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.651842 4933 scope.go:117] "RemoveContainer" containerID="f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6" Dec 02 16:16:45 crc kubenswrapper[4933]: E1202 16:16:45.654170 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6\": container with ID starting with f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6 not found: ID does not exist" containerID="f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.654204 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6"} err="failed to get container status \"f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6\": rpc error: code = NotFound desc = could not find container \"f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6\": container with ID starting with f45eb1443ec28177bbfb8574b43299e194d7c4c93c0f813f9395d9a6ad9a54d6 not found: ID does not exist" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.654243 4933 scope.go:117] "RemoveContainer" 
containerID="86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48" Dec 02 16:16:45 crc kubenswrapper[4933]: E1202 16:16:45.655719 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48\": container with ID starting with 86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48 not found: ID does not exist" containerID="86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.655744 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48"} err="failed to get container status \"86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48\": rpc error: code = NotFound desc = could not find container \"86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48\": container with ID starting with 86977c6efa4e3349d6a8db9b5e658b6de7a1cbf704363535e45525a1e57e3a48 not found: ID does not exist" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.655760 4933 scope.go:117] "RemoveContainer" containerID="7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0" Dec 02 16:16:45 crc kubenswrapper[4933]: E1202 16:16:45.656351 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0\": container with ID starting with 7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0 not found: ID does not exist" containerID="7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0" Dec 02 16:16:45 crc kubenswrapper[4933]: I1202 16:16:45.656396 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0"} err="failed to get container status \"7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0\": rpc error: code = NotFound desc = could not find container \"7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0\": container with ID starting with 7d904d52a1e68759a1ffcde979b9f78288fc4ff777a99f6e8b3234c9555658a0 not found: ID does not exist" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.294931 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.295270 4933 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.455745 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pvgh5"] Dec 02 16:16:46 crc kubenswrapper[4933]: E1202 16:16:46.456367 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="registry-server" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.456394 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="registry-server" Dec 02 16:16:46 crc kubenswrapper[4933]: E1202 16:16:46.456425 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="extract-content" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.456434 4933 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="extract-content" Dec 02 16:16:46 crc kubenswrapper[4933]: E1202 16:16:46.456486 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="extract-utilities" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.456496 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="extract-utilities" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.456790 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" containerName="registry-server" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.457917 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.489477 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.494910 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pvgh5"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.530511 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bea586b-0ef5-4524-8171-5dce637605a8-operator-scripts\") pod \"nova-api-db-create-pvgh5\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") " pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.530667 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvxdc\" (UniqueName: \"kubernetes.io/projected/9bea586b-0ef5-4524-8171-5dce637605a8-kube-api-access-zvxdc\") pod \"nova-api-db-create-pvgh5\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") " pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.579300 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jq7m4"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.581242 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.619903 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jq7m4"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.637986 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxdc\" (UniqueName: \"kubernetes.io/projected/9bea586b-0ef5-4524-8171-5dce637605a8-kube-api-access-zvxdc\") pod \"nova-api-db-create-pvgh5\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") " pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.638082 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtnvv\" (UniqueName: \"kubernetes.io/projected/e488a57f-0085-46e8-b496-9d5886e003fb-kube-api-access-jtnvv\") pod \"nova-cell0-db-create-jq7m4\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") " pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.638314 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e488a57f-0085-46e8-b496-9d5886e003fb-operator-scripts\") pod \"nova-cell0-db-create-jq7m4\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") " pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.638420 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bea586b-0ef5-4524-8171-5dce637605a8-operator-scripts\") pod \"nova-api-db-create-pvgh5\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") " pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.639413 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bea586b-0ef5-4524-8171-5dce637605a8-operator-scripts\") pod \"nova-api-db-create-pvgh5\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") " pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.647120 4933 generic.go:334] "Generic (PLEG): container finished" podID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerID="e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113" exitCode=0 Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.647162 4933 generic.go:334] "Generic (PLEG): container finished" podID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerID="a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a" exitCode=2 Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.647172 4933 generic.go:334] "Generic (PLEG): container finished" podID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerID="5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071" exitCode=0 Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.647241 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerDied","Data":"e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113"} Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.647271 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerDied","Data":"a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a"} Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.647283 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerDied","Data":"5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071"} Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.677315 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxdc\" (UniqueName: \"kubernetes.io/projected/9bea586b-0ef5-4524-8171-5dce637605a8-kube-api-access-zvxdc\") pod \"nova-api-db-create-pvgh5\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") " pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.689072 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-56a2-account-create-update-nt6cr"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.693037 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.696180 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.718005 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-56a2-account-create-update-nt6cr"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.741065 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e488a57f-0085-46e8-b496-9d5886e003fb-operator-scripts\") pod \"nova-cell0-db-create-jq7m4\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") " pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.741300 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtnvv\" (UniqueName: \"kubernetes.io/projected/e488a57f-0085-46e8-b496-9d5886e003fb-kube-api-access-jtnvv\") pod \"nova-cell0-db-create-jq7m4\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") " pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.741384 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmhr4\" (UniqueName: \"kubernetes.io/projected/7971426e-62c3-4709-8e81-8ce9a491f510-kube-api-access-fmhr4\") pod \"nova-api-56a2-account-create-update-nt6cr\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") " pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.741628 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7971426e-62c3-4709-8e81-8ce9a491f510-operator-scripts\") pod \"nova-api-56a2-account-create-update-nt6cr\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") " pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.742589 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e488a57f-0085-46e8-b496-9d5886e003fb-operator-scripts\") pod \"nova-cell0-db-create-jq7m4\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") " 
pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.770236 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtnvv\" (UniqueName: \"kubernetes.io/projected/e488a57f-0085-46e8-b496-9d5886e003fb-kube-api-access-jtnvv\") pod \"nova-cell0-db-create-jq7m4\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") " pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.780691 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-h4bvg"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.782193 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.799177 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-h4bvg"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.799425 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.843731 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmhr4\" (UniqueName: \"kubernetes.io/projected/7971426e-62c3-4709-8e81-8ce9a491f510-kube-api-access-fmhr4\") pod \"nova-api-56a2-account-create-update-nt6cr\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") " pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.843840 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7971426e-62c3-4709-8e81-8ce9a491f510-operator-scripts\") pod \"nova-api-56a2-account-create-update-nt6cr\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") " pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.844498 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7971426e-62c3-4709-8e81-8ce9a491f510-operator-scripts\") pod \"nova-api-56a2-account-create-update-nt6cr\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") " pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.871105 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmhr4\" (UniqueName: \"kubernetes.io/projected/7971426e-62c3-4709-8e81-8ce9a491f510-kube-api-access-fmhr4\") pod \"nova-api-56a2-account-create-update-nt6cr\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") " pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.877298 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-37a8-account-create-update-2fzs9"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.878998 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.881897 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.890208 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-37a8-account-create-update-2fzs9"] Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.930963 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.950264 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e7c120-984b-477a-929e-d0b9ddec63d8-operator-scripts\") pod \"nova-cell1-db-create-h4bvg\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:46 crc kubenswrapper[4933]: I1202 16:16:46.954927 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9d7h\" (UniqueName: \"kubernetes.io/projected/d4e7c120-984b-477a-929e-d0b9ddec63d8-kube-api-access-w9d7h\") pod \"nova-cell1-db-create-h4bvg\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.067131 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-578ms\" (UniqueName: \"kubernetes.io/projected/8bd90a06-f10c-4249-99c3-f186848b27f2-kube-api-access-578ms\") pod \"nova-cell0-37a8-account-create-update-2fzs9\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.067477 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e7c120-984b-477a-929e-d0b9ddec63d8-operator-scripts\") pod \"nova-cell1-db-create-h4bvg\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.067632 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9d7h\" (UniqueName: \"kubernetes.io/projected/d4e7c120-984b-477a-929e-d0b9ddec63d8-kube-api-access-w9d7h\") pod \"nova-cell1-db-create-h4bvg\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.067955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd90a06-f10c-4249-99c3-f186848b27f2-operator-scripts\") pod \"nova-cell0-37a8-account-create-update-2fzs9\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.068790 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e7c120-984b-477a-929e-d0b9ddec63d8-operator-scripts\") pod \"nova-cell1-db-create-h4bvg\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 
16:16:47.076332 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.086268 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2e200b-ec52-4e46-bc66-c40d8b3c0c27" path="/var/lib/kubelet/pods/aa2e200b-ec52-4e46-bc66-c40d8b3c0c27/volumes" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.087443 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2cb5-account-create-update-7gdfb"] Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.090350 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2cb5-account-create-update-7gdfb"] Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.090464 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.094720 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.099478 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9d7h\" (UniqueName: \"kubernetes.io/projected/d4e7c120-984b-477a-929e-d0b9ddec63d8-kube-api-access-w9d7h\") pod \"nova-cell1-db-create-h4bvg\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.170142 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-578ms\" (UniqueName: \"kubernetes.io/projected/8bd90a06-f10c-4249-99c3-f186848b27f2-kube-api-access-578ms\") pod \"nova-cell0-37a8-account-create-update-2fzs9\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.170318 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd90a06-f10c-4249-99c3-f186848b27f2-operator-scripts\") pod \"nova-cell0-37a8-account-create-update-2fzs9\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.172787 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd90a06-f10c-4249-99c3-f186848b27f2-operator-scripts\") pod \"nova-cell0-37a8-account-create-update-2fzs9\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.204764 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-578ms\" (UniqueName: \"kubernetes.io/projected/8bd90a06-f10c-4249-99c3-f186848b27f2-kube-api-access-578ms\") pod \"nova-cell0-37a8-account-create-update-2fzs9\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.286586 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kts7t\" (UniqueName: \"kubernetes.io/projected/adec7c24-3d21-4a10-a5f7-d7bf669b1007-kube-api-access-kts7t\") pod \"nova-cell1-2cb5-account-create-update-7gdfb\" (UID: 
\"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.288650 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adec7c24-3d21-4a10-a5f7-d7bf669b1007-operator-scripts\") pod \"nova-cell1-2cb5-account-create-update-7gdfb\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.354105 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.378497 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.392179 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adec7c24-3d21-4a10-a5f7-d7bf669b1007-operator-scripts\") pod \"nova-cell1-2cb5-account-create-update-7gdfb\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.392274 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kts7t\" (UniqueName: \"kubernetes.io/projected/adec7c24-3d21-4a10-a5f7-d7bf669b1007-kube-api-access-kts7t\") pod \"nova-cell1-2cb5-account-create-update-7gdfb\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.393348 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adec7c24-3d21-4a10-a5f7-d7bf669b1007-operator-scripts\") pod \"nova-cell1-2cb5-account-create-update-7gdfb\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.460414 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kts7t\" (UniqueName: \"kubernetes.io/projected/adec7c24-3d21-4a10-a5f7-d7bf669b1007-kube-api-access-kts7t\") pod \"nova-cell1-2cb5-account-create-update-7gdfb\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.473428 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.505331 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pvgh5"] Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.628285 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jq7m4"] Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.686926 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pvgh5" event={"ID":"9bea586b-0ef5-4524-8171-5dce637605a8","Type":"ContainerStarted","Data":"ad2b533039fe5e1149214f78a312ab6242a5f2b38cddab7a2ff9403a952c000b"} Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.687923 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jq7m4" event={"ID":"e488a57f-0085-46e8-b496-9d5886e003fb","Type":"ContainerStarted","Data":"b6156d73d9d164468c982e361e2a7bdbc135c380276929185602a6a62ae07688"} Dec 02 16:16:47 crc kubenswrapper[4933]: I1202 16:16:47.874391 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-56a2-account-create-update-nt6cr"] Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.290962 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-37a8-account-create-update-2fzs9"] Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.381127 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-h4bvg"] Dec 02 16:16:48 crc kubenswrapper[4933]: W1202 16:16:48.396707 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4e7c120_984b_477a_929e_d0b9ddec63d8.slice/crio-057a8f6cf3fe7ecb08bee1c49a23693e6959938aa62739de0532acd90b04f410 WatchSource:0}: Error finding container 057a8f6cf3fe7ecb08bee1c49a23693e6959938aa62739de0532acd90b04f410: Status 404 returned error can't find the container with id 057a8f6cf3fe7ecb08bee1c49a23693e6959938aa62739de0532acd90b04f410 Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.507413 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2cb5-account-create-update-7gdfb"] Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.700038 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" event={"ID":"adec7c24-3d21-4a10-a5f7-d7bf669b1007","Type":"ContainerStarted","Data":"92d90bd0b13e84a26d06f8265747e8ecb6849043876d47a247c0bc5f05ce7e3f"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.708968 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" event={"ID":"8bd90a06-f10c-4249-99c3-f186848b27f2","Type":"ContainerStarted","Data":"7817274f9e9d8ada2f089d65e2b8f010643d1a99e66bb96bd5e79509a196423d"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.709008 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" event={"ID":"8bd90a06-f10c-4249-99c3-f186848b27f2","Type":"ContainerStarted","Data":"657adea712b6a4d8fb207e5f35e415a77e396c13e93627249aa1d66492efc36b"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.716133 4933 generic.go:334] "Generic (PLEG): container finished" podID="e488a57f-0085-46e8-b496-9d5886e003fb" containerID="4ec71ca02ee7d7b0536de444c3d4a00719a478fad96c644b5725c9f37563e7d7" exitCode=0 Dec 02 16:16:48 crc 
kubenswrapper[4933]: I1202 16:16:48.716264 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jq7m4" event={"ID":"e488a57f-0085-46e8-b496-9d5886e003fb","Type":"ContainerDied","Data":"4ec71ca02ee7d7b0536de444c3d4a00719a478fad96c644b5725c9f37563e7d7"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.730524 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" podStartSLOduration=2.730507829 podStartE2EDuration="2.730507829s" podCreationTimestamp="2025-12-02 16:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:16:48.725106882 +0000 UTC m=+1471.976333585" watchObservedRunningTime="2025-12-02 16:16:48.730507829 +0000 UTC m=+1471.981734532" Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.732146 4933 generic.go:334] "Generic (PLEG): container finished" podID="7971426e-62c3-4709-8e81-8ce9a491f510" containerID="d164e95c31b94065ab22675c51a7b0bee0a24ed21e57d8134fbde780098d8c45" exitCode=0 Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.732241 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-56a2-account-create-update-nt6cr" event={"ID":"7971426e-62c3-4709-8e81-8ce9a491f510","Type":"ContainerDied","Data":"d164e95c31b94065ab22675c51a7b0bee0a24ed21e57d8134fbde780098d8c45"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.732281 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-56a2-account-create-update-nt6cr" event={"ID":"7971426e-62c3-4709-8e81-8ce9a491f510","Type":"ContainerStarted","Data":"eb78c57fa87fac15212cfaf1182084f09868a840c81d386dd476de704ea9e34a"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.749172 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-h4bvg" event={"ID":"d4e7c120-984b-477a-929e-d0b9ddec63d8","Type":"ContainerStarted","Data":"057a8f6cf3fe7ecb08bee1c49a23693e6959938aa62739de0532acd90b04f410"} Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.757214 4933 generic.go:334] "Generic (PLEG): container finished" podID="9bea586b-0ef5-4524-8171-5dce637605a8" containerID="b09775dea3640bffaee49f07a4d5c9359c413c076fe5b41e55c987e1c93bb0e1" exitCode=0 Dec 02 16:16:48 crc kubenswrapper[4933]: I1202 16:16:48.757257 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pvgh5" event={"ID":"9bea586b-0ef5-4524-8171-5dce637605a8","Type":"ContainerDied","Data":"b09775dea3640bffaee49f07a4d5c9359c413c076fe5b41e55c987e1c93bb0e1"} Dec 02 16:16:49 crc kubenswrapper[4933]: I1202 16:16:49.771169 4933 generic.go:334] "Generic (PLEG): container finished" podID="8bd90a06-f10c-4249-99c3-f186848b27f2" containerID="7817274f9e9d8ada2f089d65e2b8f010643d1a99e66bb96bd5e79509a196423d" exitCode=0 Dec 02 16:16:49 crc kubenswrapper[4933]: I1202 16:16:49.771272 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" event={"ID":"8bd90a06-f10c-4249-99c3-f186848b27f2","Type":"ContainerDied","Data":"7817274f9e9d8ada2f089d65e2b8f010643d1a99e66bb96bd5e79509a196423d"} Dec 02 16:16:49 crc kubenswrapper[4933]: I1202 16:16:49.773983 4933 generic.go:334] "Generic (PLEG): container finished" podID="d4e7c120-984b-477a-929e-d0b9ddec63d8" containerID="7e58dcaca13dfc2b18cb9fef5e610f57b93bd430c5179fe8c0df14b01a0dd738" exitCode=0 Dec 02 16:16:49 crc kubenswrapper[4933]: I1202 
Dec 02 16:16:49 crc kubenswrapper[4933]: I1202 16:16:49.775931 4933 generic.go:334] "Generic (PLEG): container finished" podID="adec7c24-3d21-4a10-a5f7-d7bf669b1007" containerID="761e5650d93526f31152508daccf11a2a4a61a8b05ea5e05382de732134fb744" exitCode=0
Dec 02 16:16:49 crc kubenswrapper[4933]: I1202 16:16:49.775981 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" event={"ID":"adec7c24-3d21-4a10-a5f7-d7bf669b1007","Type":"ContainerDied","Data":"761e5650d93526f31152508daccf11a2a4a61a8b05ea5e05382de732134fb744"}
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.430406 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pvgh5"
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.437234 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jq7m4"
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.486368 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-56a2-account-create-update-nt6cr"
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.492967 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e488a57f-0085-46e8-b496-9d5886e003fb-operator-scripts\") pod \"e488a57f-0085-46e8-b496-9d5886e003fb\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") "
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.493506 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bea586b-0ef5-4524-8171-5dce637605a8-operator-scripts\") pod \"9bea586b-0ef5-4524-8171-5dce637605a8\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") "
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.493753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e488a57f-0085-46e8-b496-9d5886e003fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e488a57f-0085-46e8-b496-9d5886e003fb" (UID: "e488a57f-0085-46e8-b496-9d5886e003fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.493994 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bea586b-0ef5-4524-8171-5dce637605a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bea586b-0ef5-4524-8171-5dce637605a8" (UID: "9bea586b-0ef5-4524-8171-5dce637605a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.504047 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtnvv\" (UniqueName: \"kubernetes.io/projected/e488a57f-0085-46e8-b496-9d5886e003fb-kube-api-access-jtnvv\") pod \"e488a57f-0085-46e8-b496-9d5886e003fb\" (UID: \"e488a57f-0085-46e8-b496-9d5886e003fb\") "
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.504385 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvxdc\" (UniqueName: \"kubernetes.io/projected/9bea586b-0ef5-4524-8171-5dce637605a8-kube-api-access-zvxdc\") pod \"9bea586b-0ef5-4524-8171-5dce637605a8\" (UID: \"9bea586b-0ef5-4524-8171-5dce637605a8\") "
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.505434 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e488a57f-0085-46e8-b496-9d5886e003fb-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.505547 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bea586b-0ef5-4524-8171-5dce637605a8-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.508844 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e488a57f-0085-46e8-b496-9d5886e003fb-kube-api-access-jtnvv" (OuterVolumeSpecName: "kube-api-access-jtnvv") pod "e488a57f-0085-46e8-b496-9d5886e003fb" (UID: "e488a57f-0085-46e8-b496-9d5886e003fb"). InnerVolumeSpecName "kube-api-access-jtnvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.509927 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bea586b-0ef5-4524-8171-5dce637605a8-kube-api-access-zvxdc" (OuterVolumeSpecName: "kube-api-access-zvxdc") pod "9bea586b-0ef5-4524-8171-5dce637605a8" (UID: "9bea586b-0ef5-4524-8171-5dce637605a8"). InnerVolumeSpecName "kube-api-access-zvxdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.608801 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmhr4\" (UniqueName: \"kubernetes.io/projected/7971426e-62c3-4709-8e81-8ce9a491f510-kube-api-access-fmhr4\") pod \"7971426e-62c3-4709-8e81-8ce9a491f510\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") "
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.609058 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7971426e-62c3-4709-8e81-8ce9a491f510-operator-scripts\") pod \"7971426e-62c3-4709-8e81-8ce9a491f510\" (UID: \"7971426e-62c3-4709-8e81-8ce9a491f510\") "
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.609586 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtnvv\" (UniqueName: \"kubernetes.io/projected/e488a57f-0085-46e8-b496-9d5886e003fb-kube-api-access-jtnvv\") on node \"crc\" DevicePath \"\""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.609606 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvxdc\" (UniqueName: \"kubernetes.io/projected/9bea586b-0ef5-4524-8171-5dce637605a8-kube-api-access-zvxdc\") on node \"crc\" DevicePath \"\""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.615926 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7971426e-62c3-4709-8e81-8ce9a491f510-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7971426e-62c3-4709-8e81-8ce9a491f510" (UID: "7971426e-62c3-4709-8e81-8ce9a491f510"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.620204 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7971426e-62c3-4709-8e81-8ce9a491f510-kube-api-access-fmhr4" (OuterVolumeSpecName: "kube-api-access-fmhr4") pod "7971426e-62c3-4709-8e81-8ce9a491f510" (UID: "7971426e-62c3-4709-8e81-8ce9a491f510"). InnerVolumeSpecName "kube-api-access-fmhr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.711350 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7971426e-62c3-4709-8e81-8ce9a491f510-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.711649 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmhr4\" (UniqueName: \"kubernetes.io/projected/7971426e-62c3-4709-8e81-8ce9a491f510-kube-api-access-fmhr4\") on node \"crc\" DevicePath \"\""
Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.792212 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pvgh5"
Need to start a new one" pod="openstack/nova-api-db-create-pvgh5" Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.793082 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pvgh5" event={"ID":"9bea586b-0ef5-4524-8171-5dce637605a8","Type":"ContainerDied","Data":"ad2b533039fe5e1149214f78a312ab6242a5f2b38cddab7a2ff9403a952c000b"} Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.793119 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2b533039fe5e1149214f78a312ab6242a5f2b38cddab7a2ff9403a952c000b" Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.799308 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jq7m4" event={"ID":"e488a57f-0085-46e8-b496-9d5886e003fb","Type":"ContainerDied","Data":"b6156d73d9d164468c982e361e2a7bdbc135c380276929185602a6a62ae07688"} Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.799346 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6156d73d9d164468c982e361e2a7bdbc135c380276929185602a6a62ae07688" Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.799347 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jq7m4" Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.800970 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-56a2-account-create-update-nt6cr" Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.804172 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-56a2-account-create-update-nt6cr" event={"ID":"7971426e-62c3-4709-8e81-8ce9a491f510","Type":"ContainerDied","Data":"eb78c57fa87fac15212cfaf1182084f09868a840c81d386dd476de704ea9e34a"} Dec 02 16:16:50 crc kubenswrapper[4933]: I1202 16:16:50.804443 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb78c57fa87fac15212cfaf1182084f09868a840c81d386dd476de704ea9e34a" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.086798 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.227131 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e7c120-984b-477a-929e-d0b9ddec63d8-operator-scripts\") pod \"d4e7c120-984b-477a-929e-d0b9ddec63d8\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.227274 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9d7h\" (UniqueName: \"kubernetes.io/projected/d4e7c120-984b-477a-929e-d0b9ddec63d8-kube-api-access-w9d7h\") pod \"d4e7c120-984b-477a-929e-d0b9ddec63d8\" (UID: \"d4e7c120-984b-477a-929e-d0b9ddec63d8\") " Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.228134 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4e7c120-984b-477a-929e-d0b9ddec63d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4e7c120-984b-477a-929e-d0b9ddec63d8" (UID: "d4e7c120-984b-477a-929e-d0b9ddec63d8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.238144 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e7c120-984b-477a-929e-d0b9ddec63d8-kube-api-access-w9d7h" (OuterVolumeSpecName: "kube-api-access-w9d7h") pod "d4e7c120-984b-477a-929e-d0b9ddec63d8" (UID: "d4e7c120-984b-477a-929e-d0b9ddec63d8"). InnerVolumeSpecName "kube-api-access-w9d7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.330736 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9d7h\" (UniqueName: \"kubernetes.io/projected/d4e7c120-984b-477a-929e-d0b9ddec63d8-kube-api-access-w9d7h\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.330774 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e7c120-984b-477a-929e-d0b9ddec63d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.386421 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.392498 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.434894 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adec7c24-3d21-4a10-a5f7-d7bf669b1007-operator-scripts\") pod \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.434992 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd90a06-f10c-4249-99c3-f186848b27f2-operator-scripts\") pod \"8bd90a06-f10c-4249-99c3-f186848b27f2\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.435011 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-578ms\" (UniqueName: \"kubernetes.io/projected/8bd90a06-f10c-4249-99c3-f186848b27f2-kube-api-access-578ms\") pod \"8bd90a06-f10c-4249-99c3-f186848b27f2\" (UID: \"8bd90a06-f10c-4249-99c3-f186848b27f2\") " Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.435047 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kts7t\" (UniqueName: \"kubernetes.io/projected/adec7c24-3d21-4a10-a5f7-d7bf669b1007-kube-api-access-kts7t\") pod \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\" (UID: \"adec7c24-3d21-4a10-a5f7-d7bf669b1007\") " Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.438071 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd90a06-f10c-4249-99c3-f186848b27f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bd90a06-f10c-4249-99c3-f186848b27f2" (UID: "8bd90a06-f10c-4249-99c3-f186848b27f2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.438374 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adec7c24-3d21-4a10-a5f7-d7bf669b1007-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adec7c24-3d21-4a10-a5f7-d7bf669b1007" (UID: "adec7c24-3d21-4a10-a5f7-d7bf669b1007"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.445058 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adec7c24-3d21-4a10-a5f7-d7bf669b1007-kube-api-access-kts7t" (OuterVolumeSpecName: "kube-api-access-kts7t") pod "adec7c24-3d21-4a10-a5f7-d7bf669b1007" (UID: "adec7c24-3d21-4a10-a5f7-d7bf669b1007"). InnerVolumeSpecName "kube-api-access-kts7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.445517 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd90a06-f10c-4249-99c3-f186848b27f2-kube-api-access-578ms" (OuterVolumeSpecName: "kube-api-access-578ms") pod "8bd90a06-f10c-4249-99c3-f186848b27f2" (UID: "8bd90a06-f10c-4249-99c3-f186848b27f2"). InnerVolumeSpecName "kube-api-access-578ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.537812 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd90a06-f10c-4249-99c3-f186848b27f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.537869 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-578ms\" (UniqueName: \"kubernetes.io/projected/8bd90a06-f10c-4249-99c3-f186848b27f2-kube-api-access-578ms\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.537881 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kts7t\" (UniqueName: \"kubernetes.io/projected/adec7c24-3d21-4a10-a5f7-d7bf669b1007-kube-api-access-kts7t\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.537891 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adec7c24-3d21-4a10-a5f7-d7bf669b1007-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.813299 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-h4bvg" event={"ID":"d4e7c120-984b-477a-929e-d0b9ddec63d8","Type":"ContainerDied","Data":"057a8f6cf3fe7ecb08bee1c49a23693e6959938aa62739de0532acd90b04f410"} Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.813358 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="057a8f6cf3fe7ecb08bee1c49a23693e6959938aa62739de0532acd90b04f410" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.813322 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-h4bvg" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.815281 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" event={"ID":"adec7c24-3d21-4a10-a5f7-d7bf669b1007","Type":"ContainerDied","Data":"92d90bd0b13e84a26d06f8265747e8ecb6849043876d47a247c0bc5f05ce7e3f"} Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.815336 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d90bd0b13e84a26d06f8265747e8ecb6849043876d47a247c0bc5f05ce7e3f" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.815296 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2cb5-account-create-update-7gdfb" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.818075 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" event={"ID":"8bd90a06-f10c-4249-99c3-f186848b27f2","Type":"ContainerDied","Data":"657adea712b6a4d8fb207e5f35e415a77e396c13e93627249aa1d66492efc36b"} Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.818117 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="657adea712b6a4d8fb207e5f35e415a77e396c13e93627249aa1d66492efc36b" Dec 02 16:16:51 crc kubenswrapper[4933]: I1202 16:16:51.818189 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-37a8-account-create-update-2fzs9" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.453099 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.467157 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-sg-core-conf-yaml\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.467215 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-combined-ca-bundle\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.467247 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb48x\" (UniqueName: \"kubernetes.io/projected/1360dd8b-76ca-4595-a12d-f8f5309b06cd-kube-api-access-hb48x\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.467278 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-run-httpd\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.467320 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-config-data\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc 
kubenswrapper[4933]: I1202 16:16:52.467398 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-log-httpd\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.467429 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-scripts\") pod \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\" (UID: \"1360dd8b-76ca-4595-a12d-f8f5309b06cd\") " Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.468173 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.469010 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.472924 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-scripts" (OuterVolumeSpecName: "scripts") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.496023 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1360dd8b-76ca-4595-a12d-f8f5309b06cd-kube-api-access-hb48x" (OuterVolumeSpecName: "kube-api-access-hb48x") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "kube-api-access-hb48x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.515311 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.571082 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.571126 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.571136 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.571172 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb48x\" (UniqueName: \"kubernetes.io/projected/1360dd8b-76ca-4595-a12d-f8f5309b06cd-kube-api-access-hb48x\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.571182 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1360dd8b-76ca-4595-a12d-f8f5309b06cd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.573917 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.639928 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-config-data" (OuterVolumeSpecName: "config-data") pod "1360dd8b-76ca-4595-a12d-f8f5309b06cd" (UID: "1360dd8b-76ca-4595-a12d-f8f5309b06cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.673095 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.673350 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1360dd8b-76ca-4595-a12d-f8f5309b06cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.832094 4933 generic.go:334] "Generic (PLEG): container finished" podID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerID="c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a" exitCode=0 Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.832174 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.832177 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerDied","Data":"c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a"} Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.832613 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1360dd8b-76ca-4595-a12d-f8f5309b06cd","Type":"ContainerDied","Data":"c240c43603a99e0451c0d890e9d13534c18d7107119e0eda4dd176e26a6728ec"} Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.832650 4933 scope.go:117] "RemoveContainer" containerID="e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.858558 4933 scope.go:117] "RemoveContainer" containerID="a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.887093 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.894663 4933 scope.go:117] "RemoveContainer" containerID="5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.895943 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914137 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914703 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bea586b-0ef5-4524-8171-5dce637605a8" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914721 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bea586b-0ef5-4524-8171-5dce637605a8" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914736 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-notification-agent" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914743 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-notification-agent" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914753 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="sg-core" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914760 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="sg-core" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914767 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7971426e-62c3-4709-8e81-8ce9a491f510" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914773 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7971426e-62c3-4709-8e81-8ce9a491f510" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914785 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e7c120-984b-477a-929e-d0b9ddec63d8" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914790 4933 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e7c120-984b-477a-929e-d0b9ddec63d8" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914804 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="proxy-httpd" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914810 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="proxy-httpd" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914913 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd90a06-f10c-4249-99c3-f186848b27f2" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914920 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd90a06-f10c-4249-99c3-f186848b27f2" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914961 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adec7c24-3d21-4a10-a5f7-d7bf669b1007" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914967 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="adec7c24-3d21-4a10-a5f7-d7bf669b1007" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914978 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-central-agent" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.914984 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-central-agent" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.914995 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e488a57f-0085-46e8-b496-9d5886e003fb" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915000 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e488a57f-0085-46e8-b496-9d5886e003fb" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915403 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="proxy-httpd" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915414 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="adec7c24-3d21-4a10-a5f7-d7bf669b1007" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915429 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-notification-agent" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915435 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd90a06-f10c-4249-99c3-f186848b27f2" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915448 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="sg-core" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915459 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7971426e-62c3-4709-8e81-8ce9a491f510" containerName="mariadb-account-create-update" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915473 4933 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" containerName="ceilometer-central-agent" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915479 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bea586b-0ef5-4524-8171-5dce637605a8" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915489 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e488a57f-0085-46e8-b496-9d5886e003fb" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.915505 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e7c120-984b-477a-929e-d0b9ddec63d8" containerName="mariadb-database-create" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.917567 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.925685 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.926019 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.959618 4933 scope.go:117] "RemoveContainer" containerID="c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.962185 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.985375 4933 scope.go:117] "RemoveContainer" containerID="e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.985994 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113\": container with ID starting with e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113 not found: ID does not exist" containerID="e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.986023 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113"} err="failed to get container status \"e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113\": rpc error: code = NotFound desc = could not find container \"e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113\": container with ID starting with e6b229e1a24db5a891b1bb3c995b31a7060b1517cbfc3e96bedafded76211113 not found: ID does not exist" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.986044 4933 scope.go:117] "RemoveContainer" containerID="a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a" Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.986369 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a\": container with ID starting with a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a not found: ID does not exist" containerID="a5812e2d5c5ef9245b1d3f50c8f25cb2f929c5400f5e20c77006f427985b994a" Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.986402 4933 
Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.986421 4933 scope.go:117] "RemoveContainer" containerID="5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071"
Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.986755 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071\": container with ID starting with 5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071 not found: ID does not exist" containerID="5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071"
Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.986777 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071"} err="failed to get container status \"5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071\": rpc error: code = NotFound desc = could not find container \"5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071\": container with ID starting with 5202c98bfd57880824328d8a0af71f21f541049fbd19445ed5c9cb9f17911071 not found: ID does not exist"
Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.986790 4933 scope.go:117] "RemoveContainer" containerID="c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a"
Dec 02 16:16:52 crc kubenswrapper[4933]: E1202 16:16:52.987032 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a\": container with ID starting with c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a not found: ID does not exist" containerID="c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a"
Dec 02 16:16:52 crc kubenswrapper[4933]: I1202 16:16:52.987078 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a"} err="failed to get container status \"c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a\": rpc error: code = NotFound desc = could not find container \"c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a\": container with ID starting with c55968801393d3b705956e2dbf7487c7a48e991336ff7e84b4aff5e5ff56c20a not found: ID does not exist"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.008663 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-run-httpd\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.008717 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh7vl\" (UniqueName: \"kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.008746 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-scripts\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.008949 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-config-data\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.009280 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-log-httpd\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.009360 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.009440 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.066222 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1360dd8b-76ca-4595-a12d-f8f5309b06cd" path="/var/lib/kubelet/pods/1360dd8b-76ca-4595-a12d-f8f5309b06cd/volumes"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111470 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-run-httpd\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111532 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh7vl\" (UniqueName: \"kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111575 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-scripts\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111609 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-config-data\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111750 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-log-httpd\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111834 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111887 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.111970 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-run-httpd\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.112214 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-log-httpd\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.117095 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.117265 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.117611 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-scripts\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.119150 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-config-data\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.131546 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh7vl\" (UniqueName: \"kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0"
\"kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl\") pod \"ceilometer-0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " pod="openstack/ceilometer-0" Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.242928 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.627931 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:16:53 crc kubenswrapper[4933]: I1202 16:16:53.845295 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerStarted","Data":"9782b08ddbc564b6efc09574cfd408240cb83ec85e62d37666e046ca044ce87a"} Dec 02 16:16:54 crc kubenswrapper[4933]: I1202 16:16:54.856095 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerStarted","Data":"ce6921617957bc01baea4675869f5c1e84f00bc3cdfbda16eab69bfff03b14c2"} Dec 02 16:16:55 crc kubenswrapper[4933]: I1202 16:16:55.879142 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerStarted","Data":"896431baf470efffca045b8dc81779f9ead5087a7af5227e50b75de891754eae"} Dec 02 16:16:56 crc kubenswrapper[4933]: I1202 16:16:56.891263 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerStarted","Data":"956929c8ea3b8eeacc1d80918c538f7e6964f31efd763b8ee1c584d454aad8d2"} Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.265173 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ck96l"] Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.266908 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.269668 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5k8wt" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.270677 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.283261 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.286856 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ck96l"] Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.328686 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-scripts\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.329249 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.329325 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-config-data\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.329443 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4hkd\" (UniqueName: \"kubernetes.io/projected/21060aba-17fe-429c-b7db-206edaaf91b4-kube-api-access-k4hkd\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.431730 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-config-data\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.431900 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4hkd\" (UniqueName: \"kubernetes.io/projected/21060aba-17fe-429c-b7db-206edaaf91b4-kube-api-access-k4hkd\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.432387 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-scripts\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: 
\"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.432597 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.437758 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-scripts\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.438688 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.440513 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-config-data\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.458940 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4hkd\" (UniqueName: \"kubernetes.io/projected/21060aba-17fe-429c-b7db-206edaaf91b4-kube-api-access-k4hkd\") pod \"nova-cell0-conductor-db-sync-ck96l\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.584558 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.979168 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerStarted","Data":"b601587d37460d13ea9bc91f4036161d1a195d7490606ebf861da62204974576"} Dec 02 16:16:57 crc kubenswrapper[4933]: I1202 16:16:57.979990 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:16:58 crc kubenswrapper[4933]: I1202 16:16:58.029810 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.357513607 podStartE2EDuration="6.029787883s" podCreationTimestamp="2025-12-02 16:16:52 +0000 UTC" firstStartedPulling="2025-12-02 16:16:53.659847474 +0000 UTC m=+1476.911074177" lastFinishedPulling="2025-12-02 16:16:57.33212175 +0000 UTC m=+1480.583348453" observedRunningTime="2025-12-02 16:16:58.016571064 +0000 UTC m=+1481.267797767" watchObservedRunningTime="2025-12-02 16:16:58.029787883 +0000 UTC m=+1481.281014586" Dec 02 16:16:58 crc kubenswrapper[4933]: I1202 16:16:58.278929 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ck96l"] Dec 02 16:16:58 crc kubenswrapper[4933]: I1202 16:16:58.989135 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ck96l" event={"ID":"21060aba-17fe-429c-b7db-206edaaf91b4","Type":"ContainerStarted","Data":"f755c9f39589bdee80d007491a185cc76a9231bd83646b48ddf7b422c4ea6afd"} Dec 02 16:17:03 crc kubenswrapper[4933]: I1202 16:17:03.568009 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:03 crc kubenswrapper[4933]: I1202 16:17:03.568743 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-central-agent" containerID="cri-o://ce6921617957bc01baea4675869f5c1e84f00bc3cdfbda16eab69bfff03b14c2" gracePeriod=30 Dec 02 16:17:03 crc kubenswrapper[4933]: I1202 16:17:03.569360 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="proxy-httpd" containerID="cri-o://b601587d37460d13ea9bc91f4036161d1a195d7490606ebf861da62204974576" gracePeriod=30 Dec 02 16:17:03 crc kubenswrapper[4933]: I1202 16:17:03.569429 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="sg-core" containerID="cri-o://956929c8ea3b8eeacc1d80918c538f7e6964f31efd763b8ee1c584d454aad8d2" gracePeriod=30 Dec 02 16:17:03 crc kubenswrapper[4933]: I1202 16:17:03.569470 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-notification-agent" containerID="cri-o://896431baf470efffca045b8dc81779f9ead5087a7af5227e50b75de891754eae" gracePeriod=30 Dec 02 16:17:04 crc kubenswrapper[4933]: I1202 16:17:04.075395 4933 generic.go:334] "Generic (PLEG): container finished" podID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerID="b601587d37460d13ea9bc91f4036161d1a195d7490606ebf861da62204974576" exitCode=0 Dec 02 16:17:04 crc kubenswrapper[4933]: I1202 16:17:04.075438 4933 generic.go:334] "Generic (PLEG): container 
finished" podID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerID="956929c8ea3b8eeacc1d80918c538f7e6964f31efd763b8ee1c584d454aad8d2" exitCode=2 Dec 02 16:17:04 crc kubenswrapper[4933]: I1202 16:17:04.075448 4933 generic.go:334] "Generic (PLEG): container finished" podID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerID="896431baf470efffca045b8dc81779f9ead5087a7af5227e50b75de891754eae" exitCode=0 Dec 02 16:17:04 crc kubenswrapper[4933]: I1202 16:17:04.075475 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerDied","Data":"b601587d37460d13ea9bc91f4036161d1a195d7490606ebf861da62204974576"} Dec 02 16:17:04 crc kubenswrapper[4933]: I1202 16:17:04.075506 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerDied","Data":"956929c8ea3b8eeacc1d80918c538f7e6964f31efd763b8ee1c584d454aad8d2"} Dec 02 16:17:04 crc kubenswrapper[4933]: I1202 16:17:04.075519 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerDied","Data":"896431baf470efffca045b8dc81779f9ead5087a7af5227e50b75de891754eae"} Dec 02 16:17:05 crc kubenswrapper[4933]: I1202 16:17:05.089463 4933 generic.go:334] "Generic (PLEG): container finished" podID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerID="ce6921617957bc01baea4675869f5c1e84f00bc3cdfbda16eab69bfff03b14c2" exitCode=0 Dec 02 16:17:05 crc kubenswrapper[4933]: I1202 16:17:05.089516 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerDied","Data":"ce6921617957bc01baea4675869f5c1e84f00bc3cdfbda16eab69bfff03b14c2"} Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.393152 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533191 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-sg-core-conf-yaml\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533260 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-log-httpd\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533306 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-config-data\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533329 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh7vl\" (UniqueName: \"kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533369 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-run-httpd\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533484 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-combined-ca-bundle\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.533570 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-scripts\") pod \"61565441-37a1-459b-89b5-9f20a8fb69c0\" (UID: \"61565441-37a1-459b-89b5-9f20a8fb69c0\") " Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.534290 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.534311 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.534886 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.534903 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61565441-37a1-459b-89b5-9f20a8fb69c0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.539033 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-scripts" (OuterVolumeSpecName: "scripts") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.541287 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl" (OuterVolumeSpecName: "kube-api-access-zh7vl") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "kube-api-access-zh7vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.577409 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.629862 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.636677 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.636709 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.636720 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh7vl\" (UniqueName: \"kubernetes.io/projected/61565441-37a1-459b-89b5-9f20a8fb69c0-kube-api-access-zh7vl\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.636728 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.675092 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-config-data" (OuterVolumeSpecName: "config-data") pod "61565441-37a1-459b-89b5-9f20a8fb69c0" (UID: "61565441-37a1-459b-89b5-9f20a8fb69c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:09 crc kubenswrapper[4933]: I1202 16:17:09.738394 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61565441-37a1-459b-89b5-9f20a8fb69c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.174137 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61565441-37a1-459b-89b5-9f20a8fb69c0","Type":"ContainerDied","Data":"9782b08ddbc564b6efc09574cfd408240cb83ec85e62d37666e046ca044ce87a"} Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.174195 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.174211 4933 scope.go:117] "RemoveContainer" containerID="b601587d37460d13ea9bc91f4036161d1a195d7490606ebf861da62204974576" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.176480 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ck96l" event={"ID":"21060aba-17fe-429c-b7db-206edaaf91b4","Type":"ContainerStarted","Data":"a13ae2dda21fde57b30ec26bf269ac5108cc18c5faac2ae044bc0843d42b2feb"} Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.196181 4933 scope.go:117] "RemoveContainer" containerID="956929c8ea3b8eeacc1d80918c538f7e6964f31efd763b8ee1c584d454aad8d2" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.211643 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ck96l" podStartSLOduration=2.456699319 podStartE2EDuration="13.211619178s" podCreationTimestamp="2025-12-02 16:16:57 +0000 UTC" firstStartedPulling="2025-12-02 16:16:58.276380716 +0000 UTC m=+1481.527607419" lastFinishedPulling="2025-12-02 16:17:09.031300575 +0000 UTC m=+1492.282527278" observedRunningTime="2025-12-02 16:17:10.202334653 +0000 UTC m=+1493.453561356" watchObservedRunningTime="2025-12-02 16:17:10.211619178 +0000 UTC m=+1493.462845881" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.225679 4933 scope.go:117] "RemoveContainer" containerID="896431baf470efffca045b8dc81779f9ead5087a7af5227e50b75de891754eae" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.238598 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.255234 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.256000 4933 scope.go:117] "RemoveContainer" containerID="ce6921617957bc01baea4675869f5c1e84f00bc3cdfbda16eab69bfff03b14c2" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.272950 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:10 crc kubenswrapper[4933]: E1202 16:17:10.273403 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="sg-core" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273421 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="sg-core" Dec 02 16:17:10 crc kubenswrapper[4933]: E1202 16:17:10.273451 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-central-agent" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273458 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-central-agent" Dec 02 16:17:10 crc kubenswrapper[4933]: E1202 16:17:10.273477 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-notification-agent" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273484 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-notification-agent" Dec 02 16:17:10 crc kubenswrapper[4933]: E1202 16:17:10.273500 4933 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="proxy-httpd" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273507 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="proxy-httpd" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273711 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-central-agent" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273734 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="proxy-httpd" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273754 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="ceilometer-notification-agent" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.273767 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" containerName="sg-core" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.275728 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.278910 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.279288 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.289685 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.460980 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.462169 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.462288 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-config-data\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.462423 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-run-httpd\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.462536 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-log-httpd\") pod \"ceilometer-0\" (UID: 
\"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.462767 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-scripts\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.462893 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s85p\" (UniqueName: \"kubernetes.io/projected/6aa9c547-4fd4-4838-a679-6069e762f51c-kube-api-access-7s85p\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564402 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-scripts\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564716 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s85p\" (UniqueName: \"kubernetes.io/projected/6aa9c547-4fd4-4838-a679-6069e762f51c-kube-api-access-7s85p\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564770 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564897 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564919 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-config-data\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564946 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-run-httpd\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.564974 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-log-httpd\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.566289 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-run-httpd\") pod 
\"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.566381 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-log-httpd\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.570419 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.570857 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-config-data\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.576757 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.580385 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-scripts\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.584861 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s85p\" (UniqueName: \"kubernetes.io/projected/6aa9c547-4fd4-4838-a679-6069e762f51c-kube-api-access-7s85p\") pod \"ceilometer-0\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " pod="openstack/ceilometer-0" Dec 02 16:17:10 crc kubenswrapper[4933]: I1202 16:17:10.600795 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:11 crc kubenswrapper[4933]: I1202 16:17:11.067238 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61565441-37a1-459b-89b5-9f20a8fb69c0" path="/var/lib/kubelet/pods/61565441-37a1-459b-89b5-9f20a8fb69c0/volumes" Dec 02 16:17:11 crc kubenswrapper[4933]: W1202 16:17:11.130075 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa9c547_4fd4_4838_a679_6069e762f51c.slice/crio-8511c7b3995e13764830877759d6c3089e7c07d255fd92c6f94033a3055f70f8 WatchSource:0}: Error finding container 8511c7b3995e13764830877759d6c3089e7c07d255fd92c6f94033a3055f70f8: Status 404 returned error can't find the container with id 8511c7b3995e13764830877759d6c3089e7c07d255fd92c6f94033a3055f70f8 Dec 02 16:17:11 crc kubenswrapper[4933]: I1202 16:17:11.145948 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:11 crc kubenswrapper[4933]: I1202 16:17:11.191450 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerStarted","Data":"8511c7b3995e13764830877759d6c3089e7c07d255fd92c6f94033a3055f70f8"} Dec 02 16:17:12 crc kubenswrapper[4933]: I1202 16:17:12.207112 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerStarted","Data":"470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f"} Dec 02 16:17:12 crc kubenswrapper[4933]: I1202 16:17:12.987429 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-nc5lc"] Dec 02 16:17:12 crc kubenswrapper[4933]: I1202 16:17:12.989549 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:12 crc kubenswrapper[4933]: I1202 16:17:12.997889 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-nc5lc"] Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.112950 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.130805 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx27z\" (UniqueName: \"kubernetes.io/projected/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-kube-api-access-nx27z\") pod \"aodh-db-create-nc5lc\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.131222 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-operator-scripts\") pod \"aodh-db-create-nc5lc\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.134932 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-9217-account-create-update-mnldr"] Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.136804 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.139248 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.170995 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-9217-account-create-update-mnldr"] Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.224616 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerStarted","Data":"e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79"} Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.233186 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cde07d2-340f-48ad-bb66-db94386d9052-operator-scripts\") pod \"aodh-9217-account-create-update-mnldr\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.233262 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwcp\" (UniqueName: \"kubernetes.io/projected/9cde07d2-340f-48ad-bb66-db94386d9052-kube-api-access-8cwcp\") pod \"aodh-9217-account-create-update-mnldr\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.233471 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-operator-scripts\") pod \"aodh-db-create-nc5lc\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.233695 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx27z\" (UniqueName: \"kubernetes.io/projected/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-kube-api-access-nx27z\") pod \"aodh-db-create-nc5lc\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.234315 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-operator-scripts\") pod \"aodh-db-create-nc5lc\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.252001 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx27z\" (UniqueName: \"kubernetes.io/projected/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-kube-api-access-nx27z\") pod \"aodh-db-create-nc5lc\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.335964 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwcp\" (UniqueName: \"kubernetes.io/projected/9cde07d2-340f-48ad-bb66-db94386d9052-kube-api-access-8cwcp\") pod \"aodh-9217-account-create-update-mnldr\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: 
I1202 16:17:13.336256 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cde07d2-340f-48ad-bb66-db94386d9052-operator-scripts\") pod \"aodh-9217-account-create-update-mnldr\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.336915 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cde07d2-340f-48ad-bb66-db94386d9052-operator-scripts\") pod \"aodh-9217-account-create-update-mnldr\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.345388 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.356951 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwcp\" (UniqueName: \"kubernetes.io/projected/9cde07d2-340f-48ad-bb66-db94386d9052-kube-api-access-8cwcp\") pod \"aodh-9217-account-create-update-mnldr\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:13 crc kubenswrapper[4933]: I1202 16:17:13.465710 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.012356 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-nc5lc"] Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.123956 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-9217-account-create-update-mnldr"] Dec 02 16:17:14 crc kubenswrapper[4933]: W1202 16:17:14.129312 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice/crio-765cecca6fbfd3edb5dd54e81aa4cde23ec739c1f93f91a345a5bf2a4045f503 WatchSource:0}: Error finding container 765cecca6fbfd3edb5dd54e81aa4cde23ec739c1f93f91a345a5bf2a4045f503: Status 404 returned error can't find the container with id 765cecca6fbfd3edb5dd54e81aa4cde23ec739c1f93f91a345a5bf2a4045f503 Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.235176 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-9217-account-create-update-mnldr" event={"ID":"9cde07d2-340f-48ad-bb66-db94386d9052","Type":"ContainerStarted","Data":"765cecca6fbfd3edb5dd54e81aa4cde23ec739c1f93f91a345a5bf2a4045f503"} Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.237537 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nc5lc" event={"ID":"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8","Type":"ContainerStarted","Data":"13dfbc5c0d228e4615e41c19a2b1d2405ee951cad4901d23e29cefd7ef71304c"} Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.237572 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nc5lc" event={"ID":"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8","Type":"ContainerStarted","Data":"43ab1b9955780577e3f9c0b645c7d9e41f1ce8c0d7b9d6bc33b322927a3ab51f"} Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.241207 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerStarted","Data":"4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b"} Dec 02 16:17:14 crc kubenswrapper[4933]: I1202 16:17:14.271143 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-nc5lc" podStartSLOduration=2.271122545 podStartE2EDuration="2.271122545s" podCreationTimestamp="2025-12-02 16:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:14.256326148 +0000 UTC m=+1497.507552841" watchObservedRunningTime="2025-12-02 16:17:14.271122545 +0000 UTC m=+1497.522349248" Dec 02 16:17:15 crc kubenswrapper[4933]: I1202 16:17:15.252989 4933 generic.go:334] "Generic (PLEG): container finished" podID="6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" containerID="13dfbc5c0d228e4615e41c19a2b1d2405ee951cad4901d23e29cefd7ef71304c" exitCode=0 Dec 02 16:17:15 crc kubenswrapper[4933]: I1202 16:17:15.253112 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nc5lc" event={"ID":"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8","Type":"ContainerDied","Data":"13dfbc5c0d228e4615e41c19a2b1d2405ee951cad4901d23e29cefd7ef71304c"} Dec 02 16:17:15 crc kubenswrapper[4933]: I1202 16:17:15.256925 4933 generic.go:334] "Generic (PLEG): container finished" podID="9cde07d2-340f-48ad-bb66-db94386d9052" containerID="98f6a454b62a21e83e74b898e2d989fa4846f785c24ecabe3e51d7ebb5cbb6c7" exitCode=0 Dec 02 16:17:15 crc kubenswrapper[4933]: I1202 16:17:15.257088 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-9217-account-create-update-mnldr" event={"ID":"9cde07d2-340f-48ad-bb66-db94386d9052","Type":"ContainerDied","Data":"98f6a454b62a21e83e74b898e2d989fa4846f785c24ecabe3e51d7ebb5cbb6c7"} Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.274665 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-central-agent" containerID="cri-o://470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f" gracePeriod=30 Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.275081 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerStarted","Data":"2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba"} Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.275371 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.275428 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="proxy-httpd" containerID="cri-o://2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba" gracePeriod=30 Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.275603 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="sg-core" containerID="cri-o://4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b" gracePeriod=30 Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.275581 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-notification-agent" containerID="cri-o://e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79" gracePeriod=30 Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.309179 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9053648399999998 podStartE2EDuration="6.309155116s" podCreationTimestamp="2025-12-02 16:17:10 +0000 UTC" firstStartedPulling="2025-12-02 16:17:11.133083203 +0000 UTC m=+1494.384309906" lastFinishedPulling="2025-12-02 16:17:15.536873479 +0000 UTC m=+1498.788100182" observedRunningTime="2025-12-02 16:17:16.298226385 +0000 UTC m=+1499.549453098" watchObservedRunningTime="2025-12-02 16:17:16.309155116 +0000 UTC m=+1499.560381839" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.826388 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.828321 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx27z\" (UniqueName: \"kubernetes.io/projected/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-kube-api-access-nx27z\") pod \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.828511 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-operator-scripts\") pod \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\" (UID: \"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8\") " Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.831324 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" (UID: "6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.833460 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.834887 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-kube-api-access-nx27z" (OuterVolumeSpecName: "kube-api-access-nx27z") pod "6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" (UID: "6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8"). InnerVolumeSpecName "kube-api-access-nx27z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.930383 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cwcp\" (UniqueName: \"kubernetes.io/projected/9cde07d2-340f-48ad-bb66-db94386d9052-kube-api-access-8cwcp\") pod \"9cde07d2-340f-48ad-bb66-db94386d9052\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.930472 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cde07d2-340f-48ad-bb66-db94386d9052-operator-scripts\") pod \"9cde07d2-340f-48ad-bb66-db94386d9052\" (UID: \"9cde07d2-340f-48ad-bb66-db94386d9052\") " Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.931019 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cde07d2-340f-48ad-bb66-db94386d9052-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cde07d2-340f-48ad-bb66-db94386d9052" (UID: "9cde07d2-340f-48ad-bb66-db94386d9052"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.931142 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx27z\" (UniqueName: \"kubernetes.io/projected/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-kube-api-access-nx27z\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.931164 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cde07d2-340f-48ad-bb66-db94386d9052-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.931174 4933 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:16 crc kubenswrapper[4933]: I1202 16:17:16.933775 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cde07d2-340f-48ad-bb66-db94386d9052-kube-api-access-8cwcp" (OuterVolumeSpecName: "kube-api-access-8cwcp") pod "9cde07d2-340f-48ad-bb66-db94386d9052" (UID: "9cde07d2-340f-48ad-bb66-db94386d9052"). InnerVolumeSpecName "kube-api-access-8cwcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.033900 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cwcp\" (UniqueName: \"kubernetes.io/projected/9cde07d2-340f-48ad-bb66-db94386d9052-kube-api-access-8cwcp\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.287750 4933 generic.go:334] "Generic (PLEG): container finished" podID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerID="2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba" exitCode=0 Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.287780 4933 generic.go:334] "Generic (PLEG): container finished" podID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerID="4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b" exitCode=2 Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.287788 4933 generic.go:334] "Generic (PLEG): container finished" podID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerID="e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79" exitCode=0 Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.287862 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerDied","Data":"2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba"} Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.287890 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerDied","Data":"4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b"} Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.287899 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerDied","Data":"e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79"} Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.289037 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-9217-account-create-update-mnldr" event={"ID":"9cde07d2-340f-48ad-bb66-db94386d9052","Type":"ContainerDied","Data":"765cecca6fbfd3edb5dd54e81aa4cde23ec739c1f93f91a345a5bf2a4045f503"} Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.289062 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765cecca6fbfd3edb5dd54e81aa4cde23ec739c1f93f91a345a5bf2a4045f503" Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.289112 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-9217-account-create-update-mnldr" Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.294032 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nc5lc" event={"ID":"6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8","Type":"ContainerDied","Data":"43ab1b9955780577e3f9c0b645c7d9e41f1ce8c0d7b9d6bc33b322927a3ab51f"} Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.294084 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ab1b9955780577e3f9c0b645c7d9e41f1ce8c0d7b9d6bc33b322927a3ab51f" Dec 02 16:17:17 crc kubenswrapper[4933]: I1202 16:17:17.294327 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-nc5lc" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.228169 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.262223 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-combined-ca-bundle\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.262278 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s85p\" (UniqueName: \"kubernetes.io/projected/6aa9c547-4fd4-4838-a679-6069e762f51c-kube-api-access-7s85p\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.262341 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-run-httpd\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.262913 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.264184 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-log-httpd\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.264319 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-sg-core-conf-yaml\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.264403 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-config-data\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.264440 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-scripts\") pod \"6aa9c547-4fd4-4838-a679-6069e762f51c\" (UID: \"6aa9c547-4fd4-4838-a679-6069e762f51c\") " Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.265446 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.268119 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6aa9c547-4fd4-4838-a679-6069e762f51c-kube-api-access-7s85p" (OuterVolumeSpecName: "kube-api-access-7s85p") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "kube-api-access-7s85p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.273108 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-scripts" (OuterVolumeSpecName: "scripts") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.278762 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.331099 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.331573 4933 generic.go:334] "Generic (PLEG): container finished" podID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerID="470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f" exitCode=0 Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.331616 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerDied","Data":"470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f"} Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.331651 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6aa9c547-4fd4-4838-a679-6069e762f51c","Type":"ContainerDied","Data":"8511c7b3995e13764830877759d6c3089e7c07d255fd92c6f94033a3055f70f8"} Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.331672 4933 scope.go:117] "RemoveContainer" containerID="2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.331903 4933 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.367544 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.367602 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.367615 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s85p\" (UniqueName: \"kubernetes.io/projected/6aa9c547-4fd4-4838-a679-6069e762f51c-kube-api-access-7s85p\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.367628 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6aa9c547-4fd4-4838-a679-6069e762f51c-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.385972 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.386537 4933 scope.go:117] "RemoveContainer" containerID="4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.399833 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-config-data" (OuterVolumeSpecName: "config-data") pod "6aa9c547-4fd4-4838-a679-6069e762f51c" (UID: "6aa9c547-4fd4-4838-a679-6069e762f51c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.410503 4933 scope.go:117] "RemoveContainer" containerID="e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.441072 4933 scope.go:117] "RemoveContainer" containerID="470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456108 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-tdcg5"] Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.456748 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cde07d2-340f-48ad-bb66-db94386d9052" containerName="mariadb-account-create-update" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456772 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cde07d2-340f-48ad-bb66-db94386d9052" containerName="mariadb-account-create-update" Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.456805 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" containerName="mariadb-database-create" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456814 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" containerName="mariadb-database-create" Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.456838 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="sg-core" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456846 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="sg-core" Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.456862 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="proxy-httpd" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456869 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="proxy-httpd" Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.456891 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-notification-agent" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456900 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-notification-agent" Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.456919 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-central-agent" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.456927 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-central-agent" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.457194 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" containerName="mariadb-database-create" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.457216 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cde07d2-340f-48ad-bb66-db94386d9052" containerName="mariadb-account-create-update" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.457233 4933 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-notification-agent" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.457245 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="sg-core" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.457255 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="proxy-httpd" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.457271 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" containerName="ceilometer-central-agent" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.458281 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-tdcg5" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.460748 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.461335 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.461492 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-g75sl" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.462863 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470018 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-combined-ca-bundle\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470148 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-config-data\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470188 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9gf\" (UniqueName: \"kubernetes.io/projected/020884d1-c656-48c9-966b-5f7da8bf6af6-kube-api-access-5q9gf\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470227 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-scripts\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470700 4933 scope.go:117] "RemoveContainer" containerID="2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470748 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.470786 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa9c547-4fd4-4838-a679-6069e762f51c-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.471253 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba\": container with ID starting with 2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba not found: ID does not exist" containerID="2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.471291 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba"} err="failed to get container status \"2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba\": rpc error: code = NotFound desc = could not find container \"2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba\": container with ID starting with 2758191141d7dbcea9ebf2c3f5286ca59e3f1d947994bf62a1cbc5ea18e465ba not found: ID does not exist"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.472226 4933 scope.go:117] "RemoveContainer" containerID="4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b"
Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.472858 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b\": container with ID starting with 4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b not found: ID does not exist" containerID="4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.472894 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b"} err="failed to get container status \"4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b\": rpc error: code = NotFound desc = could not find container \"4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b\": container with ID starting with 4f895b87a8a15af65f7a9c441f30c1a5c78c0bc084c10af6c172fbe2c082220b not found: ID does not exist"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.472924 4933 scope.go:117] "RemoveContainer" containerID="e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79"
Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.476069 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79\": container with ID starting with e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79 not found: ID does not exist" containerID="e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.476107 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79"} err="failed to get container status \"e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79\": rpc error: code = NotFound desc = could not find container \"e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79\": container with ID starting with e0b29a90103c6a3bc3567e4157d4eb6032a5b1c6117b0973521dd94a567cdb79 not found: ID does not exist"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.476131 4933 scope.go:117] "RemoveContainer" containerID="470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f"
Dec 02 16:17:18 crc kubenswrapper[4933]: E1202 16:17:18.476380 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f\": container with ID starting with 470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f not found: ID does not exist" containerID="470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.476413 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f"} err="failed to get container status \"470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f\": rpc error: code = NotFound desc = could not find container \"470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f\": container with ID starting with 470bc878f07b8b62ec444c83ade123cba3e859ba450032c3cab76263e29ead2f not found: ID does not exist"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.477055 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-tdcg5"]
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.572690 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-config-data\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.573099 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9gf\" (UniqueName: \"kubernetes.io/projected/020884d1-c656-48c9-966b-5f7da8bf6af6-kube-api-access-5q9gf\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.573222 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-scripts\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.573539 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-combined-ca-bundle\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.577651 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-scripts\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.578151 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-config-data\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.579904 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-combined-ca-bundle\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.590024 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9gf\" (UniqueName: \"kubernetes.io/projected/020884d1-c656-48c9-966b-5f7da8bf6af6-kube-api-access-5q9gf\") pod \"aodh-db-sync-tdcg5\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") " pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.669606 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.696472 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.721817 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.725701 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.734082 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.734289 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.748397 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.777613 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.777725 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.777761 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-log-httpd\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.777783 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-scripts\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.777940 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5mt\" (UniqueName: \"kubernetes.io/projected/0429fdbf-27e6-4cfb-a853-efb467b315d8-kube-api-access-6z5mt\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.777964 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-config-data\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.778025 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-run-httpd\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.781553 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.879972 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.880039 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-log-httpd\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.880066 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-scripts\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.880180 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5mt\" (UniqueName: \"kubernetes.io/projected/0429fdbf-27e6-4cfb-a853-efb467b315d8-kube-api-access-6z5mt\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.880202 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-config-data\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.880268 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-run-httpd\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.880308 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0"
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.881203 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-run-httpd\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.881281 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-log-httpd\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.884425 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.886602 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.890266 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-config-data\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.891654 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-scripts\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:18 crc kubenswrapper[4933]: I1202 16:17:18.940763 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5mt\" (UniqueName: \"kubernetes.io/projected/0429fdbf-27e6-4cfb-a853-efb467b315d8-kube-api-access-6z5mt\") pod \"ceilometer-0\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " pod="openstack/ceilometer-0" Dec 02 16:17:19 crc kubenswrapper[4933]: I1202 16:17:19.056627 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 16:17:19 crc kubenswrapper[4933]: I1202 16:17:19.184994 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa9c547-4fd4-4838-a679-6069e762f51c" path="/var/lib/kubelet/pods/6aa9c547-4fd4-4838-a679-6069e762f51c/volumes"
Dec 02 16:17:19 crc kubenswrapper[4933]: W1202 16:17:19.353253 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020884d1_c656_48c9_966b_5f7da8bf6af6.slice/crio-56a25a5ba8c7f37af53a683306ce15ac41dc4e40bef4932299710590c8fc43f1 WatchSource:0}: Error finding container 56a25a5ba8c7f37af53a683306ce15ac41dc4e40bef4932299710590c8fc43f1: Status 404 returned error can't find the container with id 56a25a5ba8c7f37af53a683306ce15ac41dc4e40bef4932299710590c8fc43f1
Dec 02 16:17:19 crc kubenswrapper[4933]: I1202 16:17:19.356406 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-tdcg5"]
Dec 02 16:17:19 crc kubenswrapper[4933]: W1202 16:17:19.627604 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0429fdbf_27e6_4cfb_a853_efb467b315d8.slice/crio-59a0c70cda2d0c86052e61a5125d9f060461708075b8d35ff3f46c74590c8285 WatchSource:0}: Error finding container 59a0c70cda2d0c86052e61a5125d9f060461708075b8d35ff3f46c74590c8285: Status 404 returned error can't find the container with id 59a0c70cda2d0c86052e61a5125d9f060461708075b8d35ff3f46c74590c8285
Dec 02 16:17:19 crc kubenswrapper[4933]: I1202 16:17:19.628776 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 16:17:20 crc kubenswrapper[4933]: I1202 16:17:20.384165 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tdcg5" event={"ID":"020884d1-c656-48c9-966b-5f7da8bf6af6","Type":"ContainerStarted","Data":"56a25a5ba8c7f37af53a683306ce15ac41dc4e40bef4932299710590c8fc43f1"}
Dec 02 16:17:20 crc kubenswrapper[4933]: I1202 16:17:20.388323 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerStarted","Data":"59a0c70cda2d0c86052e61a5125d9f060461708075b8d35ff3f46c74590c8285"}
Dec 02 16:17:21 crc kubenswrapper[4933]: I1202 16:17:21.399735 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerStarted","Data":"18bd4e0e45a030384b4ca347f9229891bff82edbc21fe19844dfe65ec5ef522e"}
Dec 02 16:17:21 crc kubenswrapper[4933]: I1202 16:17:21.400482 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerStarted","Data":"5c1545d48f08e3b2797d52029f5f4f1ff712110768acf700c8d2fadd3cf3be9b"}
Dec 02 16:17:21 crc kubenswrapper[4933]: I1202 16:17:21.402638 4933 generic.go:334] "Generic (PLEG): container finished" podID="21060aba-17fe-429c-b7db-206edaaf91b4" containerID="a13ae2dda21fde57b30ec26bf269ac5108cc18c5faac2ae044bc0843d42b2feb" exitCode=0
Dec 02 16:17:21 crc kubenswrapper[4933]: I1202 16:17:21.402669 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ck96l" event={"ID":"21060aba-17fe-429c-b7db-206edaaf91b4","Type":"ContainerDied","Data":"a13ae2dda21fde57b30ec26bf269ac5108cc18c5faac2ae044bc0843d42b2feb"}
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerStarted","Data":"baf957cc50228f7a3c8d7aea81c72b28eea7a9b92af72138fd36757f5fe44c59"} Dec 02 16:17:22 crc kubenswrapper[4933]: I1202 16:17:22.971563 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ck96l" Dec 02 16:17:22 crc kubenswrapper[4933]: I1202 16:17:22.999728 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4hkd\" (UniqueName: \"kubernetes.io/projected/21060aba-17fe-429c-b7db-206edaaf91b4-kube-api-access-k4hkd\") pod \"21060aba-17fe-429c-b7db-206edaaf91b4\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " Dec 02 16:17:22 crc kubenswrapper[4933]: I1202 16:17:22.999841 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-combined-ca-bundle\") pod \"21060aba-17fe-429c-b7db-206edaaf91b4\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.013760 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-config-data\") pod \"21060aba-17fe-429c-b7db-206edaaf91b4\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.013858 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-scripts\") pod \"21060aba-17fe-429c-b7db-206edaaf91b4\" (UID: \"21060aba-17fe-429c-b7db-206edaaf91b4\") " Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.021062 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-scripts" (OuterVolumeSpecName: "scripts") pod "21060aba-17fe-429c-b7db-206edaaf91b4" (UID: "21060aba-17fe-429c-b7db-206edaaf91b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.022671 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21060aba-17fe-429c-b7db-206edaaf91b4-kube-api-access-k4hkd" (OuterVolumeSpecName: "kube-api-access-k4hkd") pod "21060aba-17fe-429c-b7db-206edaaf91b4" (UID: "21060aba-17fe-429c-b7db-206edaaf91b4"). InnerVolumeSpecName "kube-api-access-k4hkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.051171 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21060aba-17fe-429c-b7db-206edaaf91b4" (UID: "21060aba-17fe-429c-b7db-206edaaf91b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.061138 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-config-data" (OuterVolumeSpecName: "config-data") pod "21060aba-17fe-429c-b7db-206edaaf91b4" (UID: "21060aba-17fe-429c-b7db-206edaaf91b4"). InnerVolumeSpecName "config-data". 
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.117846 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.117876 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.117886 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4hkd\" (UniqueName: \"kubernetes.io/projected/21060aba-17fe-429c-b7db-206edaaf91b4-kube-api-access-k4hkd\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.117896 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21060aba-17fe-429c-b7db-206edaaf91b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.451334 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ck96l" event={"ID":"21060aba-17fe-429c-b7db-206edaaf91b4","Type":"ContainerDied","Data":"f755c9f39589bdee80d007491a185cc76a9231bd83646b48ddf7b422c4ea6afd"}
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.451595 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f755c9f39589bdee80d007491a185cc76a9231bd83646b48ddf7b422c4ea6afd"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.451381 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ck96l"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.525172 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 02 16:17:23 crc kubenswrapper[4933]: E1202 16:17:23.525755 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21060aba-17fe-429c-b7db-206edaaf91b4" containerName="nova-cell0-conductor-db-sync"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.525771 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="21060aba-17fe-429c-b7db-206edaaf91b4" containerName="nova-cell0-conductor-db-sync"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.526130 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="21060aba-17fe-429c-b7db-206edaaf91b4" containerName="nova-cell0-conductor-db-sync"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.527266 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.532324 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.535985 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.538507 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5k8wt"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.635859 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8m9s\" (UniqueName: \"kubernetes.io/projected/f8103500-191e-4988-b97f-532c1f1c1b20-kube-api-access-k8m9s\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.636105 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8103500-191e-4988-b97f-532c1f1c1b20-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.636256 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8103500-191e-4988-b97f-532c1f1c1b20-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.738222 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8m9s\" (UniqueName: \"kubernetes.io/projected/f8103500-191e-4988-b97f-532c1f1c1b20-kube-api-access-k8m9s\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.738413 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8103500-191e-4988-b97f-532c1f1c1b20-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.738507 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8103500-191e-4988-b97f-532c1f1c1b20-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.752984 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8103500-191e-4988-b97f-532c1f1c1b20-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.753082 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8103500-191e-4988-b97f-532c1f1c1b20-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0"
(UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0" Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.762522 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8m9s\" (UniqueName: \"kubernetes.io/projected/f8103500-191e-4988-b97f-532c1f1c1b20-kube-api-access-k8m9s\") pod \"nova-cell0-conductor-0\" (UID: \"f8103500-191e-4988-b97f-532c1f1c1b20\") " pod="openstack/nova-cell0-conductor-0" Dec 02 16:17:23 crc kubenswrapper[4933]: I1202 16:17:23.957869 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 16:17:24 crc kubenswrapper[4933]: E1202 16:17:24.831700 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:26 crc kubenswrapper[4933]: I1202 16:17:26.494076 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerStarted","Data":"616468f5d27027b95a4ca3f63cd1a374fe7b55927f593acdd56b27443c8847a0"} Dec 02 16:17:26 crc kubenswrapper[4933]: I1202 16:17:26.494622 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:17:26 crc kubenswrapper[4933]: I1202 16:17:26.518282 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.622915188 podStartE2EDuration="8.518259034s" podCreationTimestamp="2025-12-02 16:17:18 +0000 UTC" firstStartedPulling="2025-12-02 16:17:19.631239846 +0000 UTC m=+1502.882466549" lastFinishedPulling="2025-12-02 16:17:23.526583682 +0000 UTC m=+1506.777810395" observedRunningTime="2025-12-02 16:17:26.514403978 +0000 UTC m=+1509.765630691" watchObservedRunningTime="2025-12-02 16:17:26.518259034 +0000 UTC m=+1509.769485747" Dec 02 16:17:26 crc kubenswrapper[4933]: E1202 16:17:26.943077 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.239184 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 16:17:27 crc kubenswrapper[4933]: W1202 16:17:27.240074 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8103500_191e_4988_b97f_532c1f1c1b20.slice/crio-dcb4ee1259a31b7829b47b36f4a1a4fef5c0baf627bb80db3bd9507de2eadf03 WatchSource:0}: Error finding container dcb4ee1259a31b7829b47b36f4a1a4fef5c0baf627bb80db3bd9507de2eadf03: Status 404 returned error can't find the container with id dcb4ee1259a31b7829b47b36f4a1a4fef5c0baf627bb80db3bd9507de2eadf03 Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.506334 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"f8103500-191e-4988-b97f-532c1f1c1b20","Type":"ContainerStarted","Data":"4822d0bc927120a8f04dfc13d30053fd18d6146678487c42a991870b1ee41056"} Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.507197 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f8103500-191e-4988-b97f-532c1f1c1b20","Type":"ContainerStarted","Data":"dcb4ee1259a31b7829b47b36f4a1a4fef5c0baf627bb80db3bd9507de2eadf03"} Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.507221 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.510183 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tdcg5" event={"ID":"020884d1-c656-48c9-966b-5f7da8bf6af6","Type":"ContainerStarted","Data":"43aeb7bd707fcb0da67202e8c8af991d815d581f71fecb2bcf97157a60608b96"} Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.529389 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=4.529371267 podStartE2EDuration="4.529371267s" podCreationTimestamp="2025-12-02 16:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:27.521566462 +0000 UTC m=+1510.772793185" watchObservedRunningTime="2025-12-02 16:17:27.529371267 +0000 UTC m=+1510.780597970" Dec 02 16:17:27 crc kubenswrapper[4933]: I1202 16:17:27.549698 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-tdcg5" podStartSLOduration=2.102908054 podStartE2EDuration="9.549678016s" podCreationTimestamp="2025-12-02 16:17:18 +0000 UTC" firstStartedPulling="2025-12-02 16:17:19.36156195 +0000 UTC m=+1502.612788663" lastFinishedPulling="2025-12-02 16:17:26.808331912 +0000 UTC m=+1510.059558625" observedRunningTime="2025-12-02 16:17:27.538739975 +0000 UTC m=+1510.789966678" watchObservedRunningTime="2025-12-02 16:17:27.549678016 +0000 UTC m=+1510.800904719" Dec 02 16:17:29 crc kubenswrapper[4933]: I1202 16:17:29.536044 4933 generic.go:334] "Generic (PLEG): container finished" podID="020884d1-c656-48c9-966b-5f7da8bf6af6" containerID="43aeb7bd707fcb0da67202e8c8af991d815d581f71fecb2bcf97157a60608b96" exitCode=0 Dec 02 16:17:29 crc kubenswrapper[4933]: I1202 16:17:29.536308 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tdcg5" event={"ID":"020884d1-c656-48c9-966b-5f7da8bf6af6","Type":"ContainerDied","Data":"43aeb7bd707fcb0da67202e8c8af991d815d581f71fecb2bcf97157a60608b96"} Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.012897 4933 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.125567 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9gf\" (UniqueName: \"kubernetes.io/projected/020884d1-c656-48c9-966b-5f7da8bf6af6-kube-api-access-5q9gf\") pod \"020884d1-c656-48c9-966b-5f7da8bf6af6\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") "
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.125699 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-scripts\") pod \"020884d1-c656-48c9-966b-5f7da8bf6af6\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") "
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.125764 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-combined-ca-bundle\") pod \"020884d1-c656-48c9-966b-5f7da8bf6af6\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") "
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.125933 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-config-data\") pod \"020884d1-c656-48c9-966b-5f7da8bf6af6\" (UID: \"020884d1-c656-48c9-966b-5f7da8bf6af6\") "
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.134042 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-scripts" (OuterVolumeSpecName: "scripts") pod "020884d1-c656-48c9-966b-5f7da8bf6af6" (UID: "020884d1-c656-48c9-966b-5f7da8bf6af6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.134076 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020884d1-c656-48c9-966b-5f7da8bf6af6-kube-api-access-5q9gf" (OuterVolumeSpecName: "kube-api-access-5q9gf") pod "020884d1-c656-48c9-966b-5f7da8bf6af6" (UID: "020884d1-c656-48c9-966b-5f7da8bf6af6"). InnerVolumeSpecName "kube-api-access-5q9gf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.161077 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "020884d1-c656-48c9-966b-5f7da8bf6af6" (UID: "020884d1-c656-48c9-966b-5f7da8bf6af6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.167929 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-config-data" (OuterVolumeSpecName: "config-data") pod "020884d1-c656-48c9-966b-5f7da8bf6af6" (UID: "020884d1-c656-48c9-966b-5f7da8bf6af6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.228361 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9gf\" (UniqueName: \"kubernetes.io/projected/020884d1-c656-48c9-966b-5f7da8bf6af6-kube-api-access-5q9gf\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.228397 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.228408 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.228416 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020884d1-c656-48c9-966b-5f7da8bf6af6-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.555994 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tdcg5" event={"ID":"020884d1-c656-48c9-966b-5f7da8bf6af6","Type":"ContainerDied","Data":"56a25a5ba8c7f37af53a683306ce15ac41dc4e40bef4932299710590c8fc43f1"}
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.556379 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56a25a5ba8c7f37af53a683306ce15ac41dc4e40bef4932299710590c8fc43f1"
Dec 02 16:17:31 crc kubenswrapper[4933]: I1202 16:17:31.556099 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-tdcg5"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.021404 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Dec 02 16:17:33 crc kubenswrapper[4933]: E1202 16:17:33.022207 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020884d1-c656-48c9-966b-5f7da8bf6af6" containerName="aodh-db-sync"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.022220 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="020884d1-c656-48c9-966b-5f7da8bf6af6" containerName="aodh-db-sync"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.022506 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="020884d1-c656-48c9-966b-5f7da8bf6af6" containerName="aodh-db-sync"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.026020 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.031389 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-g75sl"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.031616 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.032160 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.037066 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.172759 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-config-data\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.172942 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-scripts\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.173501 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlkz\" (UniqueName: \"kubernetes.io/projected/a9e6f954-bf8f-4aab-a398-d3320706266e-kube-api-access-tmlkz\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.173596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.275621 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlkz\" (UniqueName: \"kubernetes.io/projected/a9e6f954-bf8f-4aab-a398-d3320706266e-kube-api-access-tmlkz\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.275681 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.275738 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-config-data\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.275873 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-scripts\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0"
I1202 16:17:33.282715 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-config-data\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0" Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.285297 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0" Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.288205 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-scripts\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0" Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.296740 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlkz\" (UniqueName: \"kubernetes.io/projected/a9e6f954-bf8f-4aab-a398-d3320706266e-kube-api-access-tmlkz\") pod \"aodh-0\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " pod="openstack/aodh-0" Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.360977 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.894483 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 16:17:33 crc kubenswrapper[4933]: I1202 16:17:33.998082 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 16:17:34 crc kubenswrapper[4933]: I1202 16:17:34.594581 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerStarted","Data":"6a5d7f2f823dc517cefa6624667db900084a4d402679db3eafd1cfcfc4f86a5e"} Dec 02 16:17:34 crc kubenswrapper[4933]: I1202 16:17:34.842462 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6zh5j"] Dec 02 16:17:34 crc kubenswrapper[4933]: I1202 16:17:34.844364 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:34 crc kubenswrapper[4933]: I1202 16:17:34.846361 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 16:17:34 crc kubenswrapper[4933]: I1202 16:17:34.848189 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 16:17:34 crc kubenswrapper[4933]: I1202 16:17:34.864567 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6zh5j"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.031957 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt9ll\" (UniqueName: \"kubernetes.io/projected/19ded5fa-9330-468d-b544-de40fe542bf0-kube-api-access-pt9ll\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.032167 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-config-data\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.032242 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-scripts\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.032375 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.132152 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.142013 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-config-data\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.142093 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-scripts\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.142184 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 
16:17:35.142382 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt9ll\" (UniqueName: \"kubernetes.io/projected/19ded5fa-9330-468d-b544-de40fe542bf0-kube-api-access-pt9ll\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.146775 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.157458 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.158203 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.159775 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.167985 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-scripts\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.168288 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.177059 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-config-data\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.189681 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt9ll\" (UniqueName: \"kubernetes.io/projected/19ded5fa-9330-468d-b544-de40fe542bf0-kube-api-access-pt9ll\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.191497 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6zh5j\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.213425 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.252418 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4pr9\" (UniqueName: \"kubernetes.io/projected/93444a85-ddc7-4883-9a66-8a759201161d-kube-api-access-s4pr9\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.252627 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93444a85-ddc7-4883-9a66-8a759201161d-logs\") pod \"nova-api-0\" (UID: 
\"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.253216 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.253506 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-config-data\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.275117 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.355588 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4pr9\" (UniqueName: \"kubernetes.io/projected/93444a85-ddc7-4883-9a66-8a759201161d-kube-api-access-s4pr9\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.355711 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.355745 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93444a85-ddc7-4883-9a66-8a759201161d-logs\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.355896 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzmvb\" (UniqueName: \"kubernetes.io/projected/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-kube-api-access-dzmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.355961 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.356003 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.356046 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-config-data\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " 
pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.356836 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93444a85-ddc7-4883-9a66-8a759201161d-logs\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.372867 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-config-data\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.385759 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.395906 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.397659 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.409347 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.409895 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4pr9\" (UniqueName: \"kubernetes.io/projected/93444a85-ddc7-4883-9a66-8a759201161d-kube-api-access-s4pr9\") pod \"nova-api-0\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.469747 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.469793 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plffx\" (UniqueName: \"kubernetes.io/projected/ce47649a-d30e-4b04-a077-b6efde5fd587-kube-api-access-plffx\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.469883 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.469967 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-config-data\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.469988 4933 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dzmvb\" (UniqueName: \"kubernetes.io/projected/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-kube-api-access-dzmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.470019 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.478142 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.478774 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.499136 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.507346 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzmvb\" (UniqueName: \"kubernetes.io/projected/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-kube-api-access-dzmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.524589 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.573527 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.573569 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plffx\" (UniqueName: \"kubernetes.io/projected/ce47649a-d30e-4b04-a077-b6efde5fd587-kube-api-access-plffx\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.573668 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-config-data\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.579444 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-config-data\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " 
pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: E1202 16:17:35.589406 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.599030 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plffx\" (UniqueName: \"kubernetes.io/projected/ce47649a-d30e-4b04-a077-b6efde5fd587-kube-api-access-plffx\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.607621 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.614905 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.617238 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.621198 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.657572 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerStarted","Data":"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472"} Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.659129 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.663857 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.685907 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7877d89589-kn582"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.687833 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.689117 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ceb88e-5297-4196-951e-3f4172c0544d-logs\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.689753 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkjnc\" (UniqueName: \"kubernetes.io/projected/65ceb88e-5297-4196-951e-3f4172c0544d-kube-api-access-tkjnc\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.693308 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-config-data\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.693638 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.710084 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.720195 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-kn582"] Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.731240 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796294 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796667 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkjnc\" (UniqueName: \"kubernetes.io/projected/65ceb88e-5297-4196-951e-3f4172c0544d-kube-api-access-tkjnc\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796702 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-config-data\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796735 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796810 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ceb88e-5297-4196-951e-3f4172c0544d-logs\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796896 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796924 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-config\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.796967 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.797023 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2v6\" (UniqueName: \"kubernetes.io/projected/29bf2149-397b-49a3-a8b5-f2793520040e-kube-api-access-pz2v6\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 
16:17:35 crc kubenswrapper[4933]: I1202 16:17:35.797545 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ceb88e-5297-4196-951e-3f4172c0544d-logs\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.804351 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-config-data\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.797118 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-svc\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.806472 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.832535 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkjnc\" (UniqueName: \"kubernetes.io/projected/65ceb88e-5297-4196-951e-3f4172c0544d-kube-api-access-tkjnc\") pod \"nova-metadata-0\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " pod="openstack/nova-metadata-0" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.911250 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-svc\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.911351 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.911494 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.911517 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-config\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.911548 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.911575 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2v6\" (UniqueName: \"kubernetes.io/projected/29bf2149-397b-49a3-a8b5-f2793520040e-kube-api-access-pz2v6\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.912732 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-svc\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.913656 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.915260 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.915890 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-config\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.916521 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:35.932136 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2v6\" (UniqueName: \"kubernetes.io/projected/29bf2149-397b-49a3-a8b5-f2793520040e-kube-api-access-pz2v6\") pod \"dnsmasq-dns-7877d89589-kn582\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.055422 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.071714 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.693738 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz7nl"] Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.695773 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.700123 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.701450 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.713793 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz7nl"] Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.848636 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.848737 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-scripts\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.848842 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjkrs\" (UniqueName: \"kubernetes.io/projected/52d2793b-1d77-4e12-bbaf-104e4174d0c6-kube-api-access-pjkrs\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.849274 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-config-data\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.913145 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6zh5j"] Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.951426 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.951760 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-scripts\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.952307 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjkrs\" (UniqueName: \"kubernetes.io/projected/52d2793b-1d77-4e12-bbaf-104e4174d0c6-kube-api-access-pjkrs\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: 
\"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.952452 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-config-data\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.955800 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.955919 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-scripts\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.959444 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-config-data\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:36 crc kubenswrapper[4933]: I1202 16:17:36.972563 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjkrs\" (UniqueName: \"kubernetes.io/projected/52d2793b-1d77-4e12-bbaf-104e4174d0c6-kube-api-access-pjkrs\") pod \"nova-cell1-conductor-db-sync-dz7nl\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.042291 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.523082 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.547965 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.559355 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.570422 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.580455 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-kn582"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.612702 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.612990 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-central-agent" containerID="cri-o://5c1545d48f08e3b2797d52029f5f4f1ff712110768acf700c8d2fadd3cf3be9b" gracePeriod=30 Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.615847 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="proxy-httpd" containerID="cri-o://616468f5d27027b95a4ca3f63cd1a374fe7b55927f593acdd56b27443c8847a0" gracePeriod=30 Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.615944 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="sg-core" containerID="cri-o://baf957cc50228f7a3c8d7aea81c72b28eea7a9b92af72138fd36757f5fe44c59" gracePeriod=30 Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.615979 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-notification-agent" containerID="cri-o://18bd4e0e45a030384b4ca347f9229891bff82edbc21fe19844dfe65ec5ef522e" gracePeriod=30 Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.630594 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz7nl"] Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.631679 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.232:3000/\": EOF" Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.689883 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6zh5j" event={"ID":"19ded5fa-9330-468d-b544-de40fe542bf0","Type":"ContainerStarted","Data":"eaa265c2a674d7092efccc8193f3d1a1ca1dd70657192ee9eb954691089f34fa"} Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.689957 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6zh5j" event={"ID":"19ded5fa-9330-468d-b544-de40fe542bf0","Type":"ContainerStarted","Data":"303ec1c41ff339a3f10d8fba9f1227bf9c71fb4ed4fb9a40356d57fb9b73243e"} Dec 02 16:17:37 crc kubenswrapper[4933]: I1202 16:17:37.715806 
4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6zh5j" podStartSLOduration=3.71578694 podStartE2EDuration="3.71578694s" podCreationTimestamp="2025-12-02 16:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:37.707970035 +0000 UTC m=+1520.959196738" watchObservedRunningTime="2025-12-02 16:17:37.71578694 +0000 UTC m=+1520.967013643" Dec 02 16:17:37 crc kubenswrapper[4933]: W1202 16:17:37.840894 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52d2793b_1d77_4e12_bbaf_104e4174d0c6.slice/crio-47a40652d3a4bc79dce104c16ca040811f9c429abedd53ce874985df1c2c4780 WatchSource:0}: Error finding container 47a40652d3a4bc79dce104c16ca040811f9c429abedd53ce874985df1c2c4780: Status 404 returned error can't find the container with id 47a40652d3a4bc79dce104c16ca040811f9c429abedd53ce874985df1c2c4780 Dec 02 16:17:37 crc kubenswrapper[4933]: W1202 16:17:37.920965 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce47649a_d30e_4b04_a077_b6efde5fd587.slice/crio-8c322613760957c3c6815cac96bcf9e0b8425fa3e8925ecc5ae46c6e36de5f9a WatchSource:0}: Error finding container 8c322613760957c3c6815cac96bcf9e0b8425fa3e8925ecc5ae46c6e36de5f9a: Status 404 returned error can't find the container with id 8c322613760957c3c6815cac96bcf9e0b8425fa3e8925ecc5ae46c6e36de5f9a Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.715260 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerStarted","Data":"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.718619 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ceb88e-5297-4196-951e-3f4172c0544d","Type":"ContainerStarted","Data":"63bdcbe0c584159b6ecbb6e80aa49d48dd3958b0db1b21cb1f88c2e5cae3127b"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.720419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb","Type":"ContainerStarted","Data":"4f40f8c95cabad472d9807c6cc7daa52032e5c73c9d45d7e4ac7f67a683246e8"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.722480 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93444a85-ddc7-4883-9a66-8a759201161d","Type":"ContainerStarted","Data":"a3aa93e7aa55cec41b4491e6edfb29cd79545ae26f11ba3526cca57728a401eb"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.729653 4933 generic.go:334] "Generic (PLEG): container finished" podID="29bf2149-397b-49a3-a8b5-f2793520040e" containerID="4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8" exitCode=0 Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.729730 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-kn582" event={"ID":"29bf2149-397b-49a3-a8b5-f2793520040e","Type":"ContainerDied","Data":"4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.729770 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-kn582" 
event={"ID":"29bf2149-397b-49a3-a8b5-f2793520040e","Type":"ContainerStarted","Data":"e4a6af734486eed2bde24cd0bfd829b2bbc893d891586226616014599ba5a0d3"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.732037 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" event={"ID":"52d2793b-1d77-4e12-bbaf-104e4174d0c6","Type":"ContainerStarted","Data":"21db6466c97949d3a87805fbfb31c2d56f02e0893d8b2f33c62470d5ebe1fcc5"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.732072 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" event={"ID":"52d2793b-1d77-4e12-bbaf-104e4174d0c6","Type":"ContainerStarted","Data":"47a40652d3a4bc79dce104c16ca040811f9c429abedd53ce874985df1c2c4780"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.753235 4933 generic.go:334] "Generic (PLEG): container finished" podID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerID="616468f5d27027b95a4ca3f63cd1a374fe7b55927f593acdd56b27443c8847a0" exitCode=0 Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.753274 4933 generic.go:334] "Generic (PLEG): container finished" podID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerID="baf957cc50228f7a3c8d7aea81c72b28eea7a9b92af72138fd36757f5fe44c59" exitCode=2 Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.753287 4933 generic.go:334] "Generic (PLEG): container finished" podID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerID="5c1545d48f08e3b2797d52029f5f4f1ff712110768acf700c8d2fadd3cf3be9b" exitCode=0 Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.753356 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerDied","Data":"616468f5d27027b95a4ca3f63cd1a374fe7b55927f593acdd56b27443c8847a0"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.753388 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerDied","Data":"baf957cc50228f7a3c8d7aea81c72b28eea7a9b92af72138fd36757f5fe44c59"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.753400 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerDied","Data":"5c1545d48f08e3b2797d52029f5f4f1ff712110768acf700c8d2fadd3cf3be9b"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.763878 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce47649a-d30e-4b04-a077-b6efde5fd587","Type":"ContainerStarted","Data":"8c322613760957c3c6815cac96bcf9e0b8425fa3e8925ecc5ae46c6e36de5f9a"} Dec 02 16:17:38 crc kubenswrapper[4933]: I1202 16:17:38.851796 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" podStartSLOduration=2.851778883 podStartE2EDuration="2.851778883s" podCreationTimestamp="2025-12-02 16:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:38.776265523 +0000 UTC m=+1522.027492226" watchObservedRunningTime="2025-12-02 16:17:38.851778883 +0000 UTC m=+1522.103005586" Dec 02 16:17:39 crc kubenswrapper[4933]: I1202 16:17:39.783500 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-kn582" 
event={"ID":"29bf2149-397b-49a3-a8b5-f2793520040e","Type":"ContainerStarted","Data":"e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd"} Dec 02 16:17:39 crc kubenswrapper[4933]: I1202 16:17:39.783804 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:40 crc kubenswrapper[4933]: I1202 16:17:40.610395 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7877d89589-kn582" podStartSLOduration=5.610374278 podStartE2EDuration="5.610374278s" podCreationTimestamp="2025-12-02 16:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:39.810066561 +0000 UTC m=+1523.061293264" watchObservedRunningTime="2025-12-02 16:17:40.610374278 +0000 UTC m=+1523.861600981" Dec 02 16:17:40 crc kubenswrapper[4933]: I1202 16:17:40.624653 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:40 crc kubenswrapper[4933]: I1202 16:17:40.635399 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:17:41 crc kubenswrapper[4933]: I1202 16:17:41.084430 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 16:17:42 crc kubenswrapper[4933]: E1202 16:17:42.230313 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:42 crc kubenswrapper[4933]: I1202 16:17:42.888018 4933 generic.go:334] "Generic (PLEG): container finished" podID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerID="18bd4e0e45a030384b4ca347f9229891bff82edbc21fe19844dfe65ec5ef522e" exitCode=0 Dec 02 16:17:42 crc kubenswrapper[4933]: I1202 16:17:42.888260 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerDied","Data":"18bd4e0e45a030384b4ca347f9229891bff82edbc21fe19844dfe65ec5ef522e"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.482911 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596264 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-run-httpd\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596336 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5mt\" (UniqueName: \"kubernetes.io/projected/0429fdbf-27e6-4cfb-a853-efb467b315d8-kube-api-access-6z5mt\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596368 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-scripts\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596401 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-config-data\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596517 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-sg-core-conf-yaml\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596544 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.596578 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-log-httpd\") pod \"0429fdbf-27e6-4cfb-a853-efb467b315d8\" (UID: \"0429fdbf-27e6-4cfb-a853-efb467b315d8\") " Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.597725 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.597909 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.617956 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-scripts" (OuterVolumeSpecName: "scripts") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.621491 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0429fdbf-27e6-4cfb-a853-efb467b315d8-kube-api-access-6z5mt" (OuterVolumeSpecName: "kube-api-access-6z5mt") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "kube-api-access-6z5mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.651455 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.706980 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.707020 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z5mt\" (UniqueName: \"kubernetes.io/projected/0429fdbf-27e6-4cfb-a853-efb467b315d8-kube-api-access-6z5mt\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.707035 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.707047 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.707059 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0429fdbf-27e6-4cfb-a853-efb467b315d8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.716008 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.749399 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-config-data" (OuterVolumeSpecName: "config-data") pod "0429fdbf-27e6-4cfb-a853-efb467b315d8" (UID: "0429fdbf-27e6-4cfb-a853-efb467b315d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.808415 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.808448 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0429fdbf-27e6-4cfb-a853-efb467b315d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.906384 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerStarted","Data":"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.916071 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ceb88e-5297-4196-951e-3f4172c0544d","Type":"ContainerStarted","Data":"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.916119 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ceb88e-5297-4196-951e-3f4172c0544d","Type":"ContainerStarted","Data":"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.916260 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-metadata" containerID="cri-o://7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146" gracePeriod=30 Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.916242 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-log" containerID="cri-o://51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561" gracePeriod=30 Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.919086 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb","Type":"ContainerStarted","Data":"d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.919228 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7" gracePeriod=30 Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.927802 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93444a85-ddc7-4883-9a66-8a759201161d","Type":"ContainerStarted","Data":"4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.927919 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93444a85-ddc7-4883-9a66-8a759201161d","Type":"ContainerStarted","Data":"cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.947648 4933 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.947604 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0429fdbf-27e6-4cfb-a853-efb467b315d8","Type":"ContainerDied","Data":"59a0c70cda2d0c86052e61a5125d9f060461708075b8d35ff3f46c74590c8285"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.947889 4933 scope.go:117] "RemoveContainer" containerID="616468f5d27027b95a4ca3f63cd1a374fe7b55927f593acdd56b27443c8847a0" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.952493 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.500421585 podStartE2EDuration="8.952475851s" podCreationTimestamp="2025-12-02 16:17:35 +0000 UTC" firstStartedPulling="2025-12-02 16:17:37.86645712 +0000 UTC m=+1521.117683823" lastFinishedPulling="2025-12-02 16:17:42.318511386 +0000 UTC m=+1525.569738089" observedRunningTime="2025-12-02 16:17:43.937094007 +0000 UTC m=+1527.188320710" watchObservedRunningTime="2025-12-02 16:17:43.952475851 +0000 UTC m=+1527.203702554" Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.959336 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce47649a-d30e-4b04-a077-b6efde5fd587","Type":"ContainerStarted","Data":"e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713"} Dec 02 16:17:43 crc kubenswrapper[4933]: I1202 16:17:43.988144 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.547272013 podStartE2EDuration="9.988120011s" podCreationTimestamp="2025-12-02 16:17:34 +0000 UTC" firstStartedPulling="2025-12-02 16:17:37.973402754 +0000 UTC m=+1521.224629457" lastFinishedPulling="2025-12-02 16:17:42.414250742 +0000 UTC m=+1525.665477455" observedRunningTime="2025-12-02 16:17:43.959727931 +0000 UTC m=+1527.210954654" watchObservedRunningTime="2025-12-02 16:17:43.988120011 +0000 UTC m=+1527.239346714" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.006500 4933 scope.go:117] "RemoveContainer" containerID="baf957cc50228f7a3c8d7aea81c72b28eea7a9b92af72138fd36757f5fe44c59" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.009658 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.525708439 podStartE2EDuration="9.009635594s" podCreationTimestamp="2025-12-02 16:17:35 +0000 UTC" firstStartedPulling="2025-12-02 16:17:37.833266335 +0000 UTC m=+1521.084493038" lastFinishedPulling="2025-12-02 16:17:42.31719349 +0000 UTC m=+1525.568420193" observedRunningTime="2025-12-02 16:17:43.976961104 +0000 UTC m=+1527.228187817" watchObservedRunningTime="2025-12-02 16:17:44.009635594 +0000 UTC m=+1527.260862297" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.047618 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.062609 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.063693 4933 scope.go:117] "RemoveContainer" containerID="18bd4e0e45a030384b4ca347f9229891bff82edbc21fe19844dfe65ec5ef522e" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.067117 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=4.6964843720000005 podStartE2EDuration="9.067104906s" podCreationTimestamp="2025-12-02 16:17:35 +0000 UTC" firstStartedPulling="2025-12-02 16:17:37.946051151 +0000 UTC m=+1521.197277854" lastFinishedPulling="2025-12-02 16:17:42.316671695 +0000 UTC m=+1525.567898388" observedRunningTime="2025-12-02 16:17:44.014413045 +0000 UTC m=+1527.265639748" watchObservedRunningTime="2025-12-02 16:17:44.067104906 +0000 UTC m=+1527.318331609" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.082659 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:44 crc kubenswrapper[4933]: E1202 16:17:44.083150 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-central-agent" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083169 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-central-agent" Dec 02 16:17:44 crc kubenswrapper[4933]: E1202 16:17:44.083200 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="sg-core" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083206 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="sg-core" Dec 02 16:17:44 crc kubenswrapper[4933]: E1202 16:17:44.083218 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="proxy-httpd" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083225 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="proxy-httpd" Dec 02 16:17:44 crc kubenswrapper[4933]: E1202 16:17:44.083253 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-notification-agent" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083259 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-notification-agent" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083478 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="sg-core" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083496 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-notification-agent" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083516 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="proxy-httpd" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.083534 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" containerName="ceilometer-central-agent" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.085492 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.090359 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.090636 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.097543 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.119931 4933 scope.go:117] "RemoveContainer" containerID="5c1545d48f08e3b2797d52029f5f4f1ff712110768acf700c8d2fadd3cf3be9b" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.217857 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.217913 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-config-data\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.218135 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-log-httpd\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.218193 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-run-httpd\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.218232 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-scripts\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.218330 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4zn\" (UniqueName: \"kubernetes.io/projected/43f4221c-6190-460d-9bd8-2a20b003c890-kube-api-access-4x4zn\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.218442 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.321184 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.321313 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.321342 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-config-data\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.321416 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-log-httpd\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.321443 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-run-httpd\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.321465 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-scripts\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.322059 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-log-httpd\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.322090 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4zn\" (UniqueName: \"kubernetes.io/projected/43f4221c-6190-460d-9bd8-2a20b003c890-kube-api-access-4x4zn\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.322126 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-run-httpd\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.326404 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.326491 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.343625 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-scripts\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.347721 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-config-data\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.351411 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4zn\" (UniqueName: \"kubernetes.io/projected/43f4221c-6190-460d-9bd8-2a20b003c890-kube-api-access-4x4zn\") pod \"ceilometer-0\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.408602 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.908247 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.946119 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkjnc\" (UniqueName: \"kubernetes.io/projected/65ceb88e-5297-4196-951e-3f4172c0544d-kube-api-access-tkjnc\") pod \"65ceb88e-5297-4196-951e-3f4172c0544d\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.946365 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-config-data\") pod \"65ceb88e-5297-4196-951e-3f4172c0544d\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.946466 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ceb88e-5297-4196-951e-3f4172c0544d-logs\") pod \"65ceb88e-5297-4196-951e-3f4172c0544d\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.946576 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-combined-ca-bundle\") pod \"65ceb88e-5297-4196-951e-3f4172c0544d\" (UID: \"65ceb88e-5297-4196-951e-3f4172c0544d\") " Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.960261 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ceb88e-5297-4196-951e-3f4172c0544d-logs" (OuterVolumeSpecName: "logs") pod "65ceb88e-5297-4196-951e-3f4172c0544d" (UID: "65ceb88e-5297-4196-951e-3f4172c0544d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:44 crc kubenswrapper[4933]: I1202 16:17:44.980042 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ceb88e-5297-4196-951e-3f4172c0544d-kube-api-access-tkjnc" (OuterVolumeSpecName: "kube-api-access-tkjnc") pod "65ceb88e-5297-4196-951e-3f4172c0544d" (UID: "65ceb88e-5297-4196-951e-3f4172c0544d"). InnerVolumeSpecName "kube-api-access-tkjnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.018081 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65ceb88e-5297-4196-951e-3f4172c0544d" (UID: "65ceb88e-5297-4196-951e-3f4172c0544d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.085443 4933 generic.go:334] "Generic (PLEG): container finished" podID="65ceb88e-5297-4196-951e-3f4172c0544d" containerID="7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146" exitCode=0 Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.085467 4933 generic.go:334] "Generic (PLEG): container finished" podID="65ceb88e-5297-4196-951e-3f4172c0544d" containerID="51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561" exitCode=143 Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.085581 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.088058 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ceb88e-5297-4196-951e-3f4172c0544d-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.088168 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.088761 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkjnc\" (UniqueName: \"kubernetes.io/projected/65ceb88e-5297-4196-951e-3f4172c0544d-kube-api-access-tkjnc\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.161163 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0429fdbf-27e6-4cfb-a853-efb467b315d8" path="/var/lib/kubelet/pods/0429fdbf-27e6-4cfb-a853-efb467b315d8/volumes" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.166759 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ceb88e-5297-4196-951e-3f4172c0544d","Type":"ContainerDied","Data":"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146"} Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.166809 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ceb88e-5297-4196-951e-3f4172c0544d","Type":"ContainerDied","Data":"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561"} Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.166832 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"65ceb88e-5297-4196-951e-3f4172c0544d","Type":"ContainerDied","Data":"63bdcbe0c584159b6ecbb6e80aa49d48dd3958b0db1b21cb1f88c2e5cae3127b"} Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.166855 4933 scope.go:117] "RemoveContainer" containerID="7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.224069 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-config-data" (OuterVolumeSpecName: "config-data") pod "65ceb88e-5297-4196-951e-3f4172c0544d" (UID: "65ceb88e-5297-4196-951e-3f4172c0544d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.294117 4933 scope.go:117] "RemoveContainer" containerID="51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.315169 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ceb88e-5297-4196-951e-3f4172c0544d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.340641 4933 scope.go:117] "RemoveContainer" containerID="7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146" Dec 02 16:17:45 crc kubenswrapper[4933]: E1202 16:17:45.344317 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146\": container with ID starting with 7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146 not found: ID does not exist" containerID="7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.344426 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146"} err="failed to get container status \"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146\": rpc error: code = NotFound desc = could not find container \"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146\": container with ID starting with 7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146 not found: ID does not exist" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.344457 4933 scope.go:117] "RemoveContainer" containerID="51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561" Dec 02 16:17:45 crc kubenswrapper[4933]: E1202 16:17:45.358576 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561\": container with ID starting with 51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561 not found: ID does not exist" containerID="51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.358635 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561"} err="failed to get container status \"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561\": rpc error: code = NotFound desc = could not find container \"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561\": container with 
ID starting with 51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561 not found: ID does not exist" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.358671 4933 scope.go:117] "RemoveContainer" containerID="7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.359245 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146"} err="failed to get container status \"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146\": rpc error: code = NotFound desc = could not find container \"7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146\": container with ID starting with 7b6279d89edb751770cc704d561d15f3898d09f37c1d706830934f74e26b4146 not found: ID does not exist" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.359295 4933 scope.go:117] "RemoveContainer" containerID="51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.360737 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561"} err="failed to get container status \"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561\": rpc error: code = NotFound desc = could not find container \"51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561\": container with ID starting with 51390e37981e53ce18d558237258371169bcb7ec1d586403702a98bd4c837561 not found: ID does not exist" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.414147 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.532285 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.550410 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.571539 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:45 crc kubenswrapper[4933]: E1202 16:17:45.572051 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-log" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.572069 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-log" Dec 02 16:17:45 crc kubenswrapper[4933]: E1202 16:17:45.572096 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-metadata" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.572102 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-metadata" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.572365 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-log" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.572390 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" containerName="nova-metadata-metadata" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.573615 4933 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.585103 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.585447 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.586373 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.623255 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.623660 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqb7l\" (UniqueName: \"kubernetes.io/projected/53aa8ccd-0db2-41b4-ae8d-328813643b2c-kube-api-access-dqb7l\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.623760 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53aa8ccd-0db2-41b4-ae8d-328813643b2c-logs\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.623780 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.624444 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-config-data\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.664668 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.664737 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:17:45 crc kubenswrapper[4933]: E1202 16:17:45.668356 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.711234 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:17:45 crc 
kubenswrapper[4933]: I1202 16:17:45.727088 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53aa8ccd-0db2-41b4-ae8d-328813643b2c-logs\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.727807 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.729491 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-config-data\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.730099 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.727550 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53aa8ccd-0db2-41b4-ae8d-328813643b2c-logs\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.730380 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqb7l\" (UniqueName: \"kubernetes.io/projected/53aa8ccd-0db2-41b4-ae8d-328813643b2c-kube-api-access-dqb7l\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.732852 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.732887 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.733816 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.735791 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-config-data\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.736159 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 
16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.752469 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqb7l\" (UniqueName: \"kubernetes.io/projected/53aa8ccd-0db2-41b4-ae8d-328813643b2c-kube-api-access-dqb7l\") pod \"nova-metadata-0\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " pod="openstack/nova-metadata-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.786752 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 16:17:45 crc kubenswrapper[4933]: I1202 16:17:45.898301 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.074015 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.153788 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerStarted","Data":"881b25992a79e2f8ba60561d7658ae8dd712b24b2588f95eb75c4c5d1db0ac68"} Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.159210 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-api" containerID="cri-o://3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472" gracePeriod=30 Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.159660 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerStarted","Data":"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198"} Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.159849 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-listener" containerID="cri-o://d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198" gracePeriod=30 Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.159918 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-notifier" containerID="cri-o://6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6" gracePeriod=30 Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.159964 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-evaluator" containerID="cri-o://a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4" gracePeriod=30 Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.174706 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-9jz6z"] Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.175012 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerName="dnsmasq-dns" containerID="cri-o://b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b" gracePeriod=10 Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.203562 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.301695693 
podStartE2EDuration="14.203543118s" podCreationTimestamp="2025-12-02 16:17:32 +0000 UTC" firstStartedPulling="2025-12-02 16:17:33.898635617 +0000 UTC m=+1517.149862320" lastFinishedPulling="2025-12-02 16:17:44.800483042 +0000 UTC m=+1528.051709745" observedRunningTime="2025-12-02 16:17:46.202297884 +0000 UTC m=+1529.453524587" watchObservedRunningTime="2025-12-02 16:17:46.203543118 +0000 UTC m=+1529.454769821" Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.268750 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.747032 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.747716 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 16:17:46 crc kubenswrapper[4933]: I1202 16:17:46.775922 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.084407 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ceb88e-5297-4196-951e-3f4172c0544d" path="/var/lib/kubelet/pods/65ceb88e-5297-4196-951e-3f4172c0544d/volumes" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.088481 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.171310 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.171357 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.212152 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sffnq\" (UniqueName: \"kubernetes.io/projected/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-kube-api-access-sffnq\") pod \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.212236 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-nb\") pod \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.212289 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-config\") pod \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.212370 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-svc\") pod \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.212480 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-swift-storage-0\") pod \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.212564 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-sb\") pod \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\" (UID: \"01dfefe7-534e-41e1-9f9b-a59f177f4c7e\") " Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.216347 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53aa8ccd-0db2-41b4-ae8d-328813643b2c","Type":"ContainerStarted","Data":"1c0c9b6c79cad1f47cb90f9b166d14876557482399a10268fabf9c2f5efa2e90"} Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.225790 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-kube-api-access-sffnq" (OuterVolumeSpecName: "kube-api-access-sffnq") pod "01dfefe7-534e-41e1-9f9b-a59f177f4c7e" 
(UID: "01dfefe7-534e-41e1-9f9b-a59f177f4c7e"). InnerVolumeSpecName "kube-api-access-sffnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.242724 4933 generic.go:334] "Generic (PLEG): container finished" podID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerID="b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b" exitCode=0 Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.242806 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" event={"ID":"01dfefe7-534e-41e1-9f9b-a59f177f4c7e","Type":"ContainerDied","Data":"b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b"} Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.242847 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" event={"ID":"01dfefe7-534e-41e1-9f9b-a59f177f4c7e","Type":"ContainerDied","Data":"07f83c27f58afda8100dcb5ef6bd319f39d955614ebab2ee0f4e7e6678bd7287"} Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.242866 4933 scope.go:117] "RemoveContainer" containerID="b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.242998 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-9jz6z" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.265976 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerStarted","Data":"dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa"} Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.281435 4933 generic.go:334] "Generic (PLEG): container finished" podID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerID="a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4" exitCode=0 Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.281466 4933 generic.go:334] "Generic (PLEG): container finished" podID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerID="3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472" exitCode=0 Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.282363 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerDied","Data":"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4"} Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.282386 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerDied","Data":"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472"} Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.299937 4933 scope.go:117] "RemoveContainer" containerID="63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.316108 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sffnq\" (UniqueName: \"kubernetes.io/projected/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-kube-api-access-sffnq\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.376461 4933 scope.go:117] "RemoveContainer" containerID="b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b" Dec 02 16:17:47 crc kubenswrapper[4933]: E1202 16:17:47.387883 4933 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b\": container with ID starting with b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b not found: ID does not exist" containerID="b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.387955 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b"} err="failed to get container status \"b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b\": rpc error: code = NotFound desc = could not find container \"b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b\": container with ID starting with b1faa251b79679e454ca73c37d4bb5745d5e4905f4ebb944f3c376ba708c320b not found: ID does not exist" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.387995 4933 scope.go:117] "RemoveContainer" containerID="63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654" Dec 02 16:17:47 crc kubenswrapper[4933]: E1202 16:17:47.398026 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654\": container with ID starting with 63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654 not found: ID does not exist" containerID="63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.398085 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654"} err="failed to get container status \"63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654\": rpc error: code = NotFound desc = could not find container \"63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654\": container with ID starting with 63762d47840846fc93de31c6a3336d8c32796518b8886be61035ba6567611654 not found: ID does not exist" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.750731 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01dfefe7-534e-41e1-9f9b-a59f177f4c7e" (UID: "01dfefe7-534e-41e1-9f9b-a59f177f4c7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.770562 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01dfefe7-534e-41e1-9f9b-a59f177f4c7e" (UID: "01dfefe7-534e-41e1-9f9b-a59f177f4c7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.783261 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-config" (OuterVolumeSpecName: "config") pod "01dfefe7-534e-41e1-9f9b-a59f177f4c7e" (UID: "01dfefe7-534e-41e1-9f9b-a59f177f4c7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.804775 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "01dfefe7-534e-41e1-9f9b-a59f177f4c7e" (UID: "01dfefe7-534e-41e1-9f9b-a59f177f4c7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.820757 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01dfefe7-534e-41e1-9f9b-a59f177f4c7e" (UID: "01dfefe7-534e-41e1-9f9b-a59f177f4c7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.847600 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.847906 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.847916 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.847925 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:47 crc kubenswrapper[4933]: I1202 16:17:47.847934 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01dfefe7-534e-41e1-9f9b-a59f177f4c7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.042110 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-9jz6z"] Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.055540 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-9jz6z"] Dec 02 16:17:48 crc kubenswrapper[4933]: E1202 16:17:48.263923 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:48 crc kubenswrapper[4933]: E1202 16:17:48.263994 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: 
unable to find data in memory cache]" Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.305921 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerStarted","Data":"243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223"} Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.311896 4933 generic.go:334] "Generic (PLEG): container finished" podID="19ded5fa-9330-468d-b544-de40fe542bf0" containerID="eaa265c2a674d7092efccc8193f3d1a1ca1dd70657192ee9eb954691089f34fa" exitCode=0 Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.311975 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6zh5j" event={"ID":"19ded5fa-9330-468d-b544-de40fe542bf0","Type":"ContainerDied","Data":"eaa265c2a674d7092efccc8193f3d1a1ca1dd70657192ee9eb954691089f34fa"} Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.313295 4933 generic.go:334] "Generic (PLEG): container finished" podID="52d2793b-1d77-4e12-bbaf-104e4174d0c6" containerID="21db6466c97949d3a87805fbfb31c2d56f02e0893d8b2f33c62470d5ebe1fcc5" exitCode=0 Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.313341 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" event={"ID":"52d2793b-1d77-4e12-bbaf-104e4174d0c6","Type":"ContainerDied","Data":"21db6466c97949d3a87805fbfb31c2d56f02e0893d8b2f33c62470d5ebe1fcc5"} Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.316467 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53aa8ccd-0db2-41b4-ae8d-328813643b2c","Type":"ContainerStarted","Data":"96461a2111b4b13c070332e2a15890eb06a74d39d83e56e9237359fd2e4e01b8"} Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.316507 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53aa8ccd-0db2-41b4-ae8d-328813643b2c","Type":"ContainerStarted","Data":"c83d407b0e5b270eea1cc779ed5faf7b92536b573251ab18a8ca439b0d4055fa"} Dec 02 16:17:48 crc kubenswrapper[4933]: I1202 16:17:48.426133 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.42572384 podStartE2EDuration="3.42572384s" podCreationTimestamp="2025-12-02 16:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:48.361326117 +0000 UTC m=+1531.612552820" watchObservedRunningTime="2025-12-02 16:17:48.42572384 +0000 UTC m=+1531.676950543" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.068526 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" path="/var/lib/kubelet/pods/01dfefe7-534e-41e1-9f9b-a59f177f4c7e/volumes" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.333878 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerStarted","Data":"9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e"} Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.679065 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hw45r"] Dec 02 16:17:49 crc kubenswrapper[4933]: E1202 16:17:49.679622 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" 
containerName="dnsmasq-dns" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.679645 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerName="dnsmasq-dns" Dec 02 16:17:49 crc kubenswrapper[4933]: E1202 16:17:49.680625 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerName="init" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.680637 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerName="init" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.680899 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="01dfefe7-534e-41e1-9f9b-a59f177f4c7e" containerName="dnsmasq-dns" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.682476 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.701669 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw45r"] Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.792559 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-catalog-content\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.792636 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcs79\" (UniqueName: \"kubernetes.io/projected/1b21b605-e2e1-412d-86d6-116c3c6684dd-kube-api-access-wcs79\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.793619 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-utilities\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.899178 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-utilities\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.899508 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-utilities\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.899925 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-catalog-content\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 
02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.900050 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcs79\" (UniqueName: \"kubernetes.io/projected/1b21b605-e2e1-412d-86d6-116c3c6684dd-kube-api-access-wcs79\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.902714 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-catalog-content\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:49 crc kubenswrapper[4933]: I1202 16:17:49.923480 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcs79\" (UniqueName: \"kubernetes.io/projected/1b21b605-e2e1-412d-86d6-116c3c6684dd-kube-api-access-wcs79\") pod \"redhat-marketplace-hw45r\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.004526 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.139185 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.152136 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.206701 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-combined-ca-bundle\") pod \"19ded5fa-9330-468d-b544-de40fe542bf0\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.206766 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-config-data\") pod \"19ded5fa-9330-468d-b544-de40fe542bf0\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.206876 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-scripts\") pod \"19ded5fa-9330-468d-b544-de40fe542bf0\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.206974 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt9ll\" (UniqueName: \"kubernetes.io/projected/19ded5fa-9330-468d-b544-de40fe542bf0-kube-api-access-pt9ll\") pod \"19ded5fa-9330-468d-b544-de40fe542bf0\" (UID: \"19ded5fa-9330-468d-b544-de40fe542bf0\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.213351 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ded5fa-9330-468d-b544-de40fe542bf0-kube-api-access-pt9ll" (OuterVolumeSpecName: "kube-api-access-pt9ll") pod "19ded5fa-9330-468d-b544-de40fe542bf0" (UID: "19ded5fa-9330-468d-b544-de40fe542bf0"). 
InnerVolumeSpecName "kube-api-access-pt9ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.227403 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-scripts" (OuterVolumeSpecName: "scripts") pod "19ded5fa-9330-468d-b544-de40fe542bf0" (UID: "19ded5fa-9330-468d-b544-de40fe542bf0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.251881 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-config-data" (OuterVolumeSpecName: "config-data") pod "19ded5fa-9330-468d-b544-de40fe542bf0" (UID: "19ded5fa-9330-468d-b544-de40fe542bf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.258218 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19ded5fa-9330-468d-b544-de40fe542bf0" (UID: "19ded5fa-9330-468d-b544-de40fe542bf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.309901 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-config-data\") pod \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.310341 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-combined-ca-bundle\") pod \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.310570 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-scripts\") pod \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.310663 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjkrs\" (UniqueName: \"kubernetes.io/projected/52d2793b-1d77-4e12-bbaf-104e4174d0c6-kube-api-access-pjkrs\") pod \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\" (UID: \"52d2793b-1d77-4e12-bbaf-104e4174d0c6\") " Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.312039 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt9ll\" (UniqueName: \"kubernetes.io/projected/19ded5fa-9330-468d-b544-de40fe542bf0-kube-api-access-pt9ll\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.312136 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.312416 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.312514 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ded5fa-9330-468d-b544-de40fe542bf0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.316175 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-scripts" (OuterVolumeSpecName: "scripts") pod "52d2793b-1d77-4e12-bbaf-104e4174d0c6" (UID: "52d2793b-1d77-4e12-bbaf-104e4174d0c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.316689 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d2793b-1d77-4e12-bbaf-104e4174d0c6-kube-api-access-pjkrs" (OuterVolumeSpecName: "kube-api-access-pjkrs") pod "52d2793b-1d77-4e12-bbaf-104e4174d0c6" (UID: "52d2793b-1d77-4e12-bbaf-104e4174d0c6"). InnerVolumeSpecName "kube-api-access-pjkrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.347571 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-config-data" (OuterVolumeSpecName: "config-data") pod "52d2793b-1d77-4e12-bbaf-104e4174d0c6" (UID: "52d2793b-1d77-4e12-bbaf-104e4174d0c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.381843 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" event={"ID":"52d2793b-1d77-4e12-bbaf-104e4174d0c6","Type":"ContainerDied","Data":"47a40652d3a4bc79dce104c16ca040811f9c429abedd53ce874985df1c2c4780"} Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.381915 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47a40652d3a4bc79dce104c16ca040811f9c429abedd53ce874985df1c2c4780" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.381929 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dz7nl" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.385197 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6zh5j" event={"ID":"19ded5fa-9330-468d-b544-de40fe542bf0","Type":"ContainerDied","Data":"303ec1c41ff339a3f10d8fba9f1227bf9c71fb4ed4fb9a40356d57fb9b73243e"} Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.385236 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="303ec1c41ff339a3f10d8fba9f1227bf9c71fb4ed4fb9a40356d57fb9b73243e" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.385296 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6zh5j" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.395131 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52d2793b-1d77-4e12-bbaf-104e4174d0c6" (UID: "52d2793b-1d77-4e12-bbaf-104e4174d0c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.415057 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.418310 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjkrs\" (UniqueName: \"kubernetes.io/projected/52d2793b-1d77-4e12-bbaf-104e4174d0c6-kube-api-access-pjkrs\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.418327 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.418338 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d2793b-1d77-4e12-bbaf-104e4174d0c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.443952 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 16:17:50 crc kubenswrapper[4933]: E1202 16:17:50.444631 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d2793b-1d77-4e12-bbaf-104e4174d0c6" containerName="nova-cell1-conductor-db-sync" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.444651 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d2793b-1d77-4e12-bbaf-104e4174d0c6" containerName="nova-cell1-conductor-db-sync" Dec 02 16:17:50 crc kubenswrapper[4933]: E1202 16:17:50.444707 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ded5fa-9330-468d-b544-de40fe542bf0" containerName="nova-manage" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.444714 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ded5fa-9330-468d-b544-de40fe542bf0" containerName="nova-manage" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.446791 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d2793b-1d77-4e12-bbaf-104e4174d0c6" containerName="nova-cell1-conductor-db-sync" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.446838 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ded5fa-9330-468d-b544-de40fe542bf0" containerName="nova-manage" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.447674 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.462108 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.519960 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3186bcaa-4306-4842-885b-6afd00553e78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.520003 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3186bcaa-4306-4842-885b-6afd00553e78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.520057 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scmk6\" (UniqueName: \"kubernetes.io/projected/3186bcaa-4306-4842-885b-6afd00553e78-kube-api-access-scmk6\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.554956 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw45r"] Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.569006 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.569320 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-log" containerID="cri-o://cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d" gracePeriod=30 Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.569633 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-api" containerID="cri-o://4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a" gracePeriod=30 Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.584227 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.584553 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ce47649a-d30e-4b04-a077-b6efde5fd587" containerName="nova-scheduler-scheduler" containerID="cri-o://e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713" gracePeriod=30 Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.621799 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.622024 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-log" containerID="cri-o://c83d407b0e5b270eea1cc779ed5faf7b92536b573251ab18a8ca439b0d4055fa" gracePeriod=30 Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.622577 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3186bcaa-4306-4842-885b-6afd00553e78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.622619 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3186bcaa-4306-4842-885b-6afd00553e78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.622673 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scmk6\" (UniqueName: \"kubernetes.io/projected/3186bcaa-4306-4842-885b-6afd00553e78-kube-api-access-scmk6\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.623739 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-metadata" containerID="cri-o://96461a2111b4b13c070332e2a15890eb06a74d39d83e56e9237359fd2e4e01b8" gracePeriod=30 Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.629243 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3186bcaa-4306-4842-885b-6afd00553e78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.629356 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3186bcaa-4306-4842-885b-6afd00553e78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.645414 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scmk6\" (UniqueName: \"kubernetes.io/projected/3186bcaa-4306-4842-885b-6afd00553e78-kube-api-access-scmk6\") pod \"nova-cell1-conductor-0\" (UID: \"3186bcaa-4306-4842-885b-6afd00553e78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: E1202 16:17:50.738725 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 16:17:50 crc kubenswrapper[4933]: E1202 16:17:50.739866 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 16:17:50 crc kubenswrapper[4933]: E1202 16:17:50.743934 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 16:17:50 crc kubenswrapper[4933]: E1202 16:17:50.743978 4933 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ce47649a-d30e-4b04-a077-b6efde5fd587" containerName="nova-scheduler-scheduler" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.770504 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.901908 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 16:17:50 crc kubenswrapper[4933]: I1202 16:17:50.901952 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.310841 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 16:17:51 crc kubenswrapper[4933]: W1202 16:17:51.353058 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3186bcaa_4306_4842_885b_6afd00553e78.slice/crio-e7470a70ec89121b4a66060586d9ddefee9ff2f35f0cbfe1d193e48d064b95d1 WatchSource:0}: Error finding container e7470a70ec89121b4a66060586d9ddefee9ff2f35f0cbfe1d193e48d064b95d1: Status 404 returned error can't find the container with id e7470a70ec89121b4a66060586d9ddefee9ff2f35f0cbfe1d193e48d064b95d1 Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.411880 4933 generic.go:334] "Generic (PLEG): container finished" podID="93444a85-ddc7-4883-9a66-8a759201161d" containerID="cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d" exitCode=143 Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.411939 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93444a85-ddc7-4883-9a66-8a759201161d","Type":"ContainerDied","Data":"cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.415070 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3186bcaa-4306-4842-885b-6afd00553e78","Type":"ContainerStarted","Data":"e7470a70ec89121b4a66060586d9ddefee9ff2f35f0cbfe1d193e48d064b95d1"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.418795 4933 generic.go:334] "Generic (PLEG): container finished" podID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerID="a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359" exitCode=0 Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.419130 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw45r" event={"ID":"1b21b605-e2e1-412d-86d6-116c3c6684dd","Type":"ContainerDied","Data":"a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.419233 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw45r" event={"ID":"1b21b605-e2e1-412d-86d6-116c3c6684dd","Type":"ContainerStarted","Data":"d75573f692765899c60adbe4bafe4f66ea48bcbd5267e76863dd0d03d5e2d1f8"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.421799 
4933 generic.go:334] "Generic (PLEG): container finished" podID="ce47649a-d30e-4b04-a077-b6efde5fd587" containerID="e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713" exitCode=0 Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.421872 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce47649a-d30e-4b04-a077-b6efde5fd587","Type":"ContainerDied","Data":"e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.427466 4933 generic.go:334] "Generic (PLEG): container finished" podID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerID="96461a2111b4b13c070332e2a15890eb06a74d39d83e56e9237359fd2e4e01b8" exitCode=0 Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.427484 4933 generic.go:334] "Generic (PLEG): container finished" podID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerID="c83d407b0e5b270eea1cc779ed5faf7b92536b573251ab18a8ca439b0d4055fa" exitCode=143 Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.427503 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53aa8ccd-0db2-41b4-ae8d-328813643b2c","Type":"ContainerDied","Data":"96461a2111b4b13c070332e2a15890eb06a74d39d83e56e9237359fd2e4e01b8"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.427523 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53aa8ccd-0db2-41b4-ae8d-328813643b2c","Type":"ContainerDied","Data":"c83d407b0e5b270eea1cc779ed5faf7b92536b573251ab18a8ca439b0d4055fa"} Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.635721 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.756625 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-config-data\") pod \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.756962 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53aa8ccd-0db2-41b4-ae8d-328813643b2c-logs\") pod \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.757152 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-combined-ca-bundle\") pod \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.757247 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqb7l\" (UniqueName: \"kubernetes.io/projected/53aa8ccd-0db2-41b4-ae8d-328813643b2c-kube-api-access-dqb7l\") pod \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") " Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.757336 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-nova-metadata-tls-certs\") pod \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\" (UID: \"53aa8ccd-0db2-41b4-ae8d-328813643b2c\") 
" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.757515 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53aa8ccd-0db2-41b4-ae8d-328813643b2c-logs" (OuterVolumeSpecName: "logs") pod "53aa8ccd-0db2-41b4-ae8d-328813643b2c" (UID: "53aa8ccd-0db2-41b4-ae8d-328813643b2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.757883 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53aa8ccd-0db2-41b4-ae8d-328813643b2c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.784149 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53aa8ccd-0db2-41b4-ae8d-328813643b2c-kube-api-access-dqb7l" (OuterVolumeSpecName: "kube-api-access-dqb7l") pod "53aa8ccd-0db2-41b4-ae8d-328813643b2c" (UID: "53aa8ccd-0db2-41b4-ae8d-328813643b2c"). InnerVolumeSpecName "kube-api-access-dqb7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.809467 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-config-data" (OuterVolumeSpecName: "config-data") pod "53aa8ccd-0db2-41b4-ae8d-328813643b2c" (UID: "53aa8ccd-0db2-41b4-ae8d-328813643b2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.822960 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53aa8ccd-0db2-41b4-ae8d-328813643b2c" (UID: "53aa8ccd-0db2-41b4-ae8d-328813643b2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.861754 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.861788 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqb7l\" (UniqueName: \"kubernetes.io/projected/53aa8ccd-0db2-41b4-ae8d-328813643b2c-kube-api-access-dqb7l\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.861804 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.907047 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "53aa8ccd-0db2-41b4-ae8d-328813643b2c" (UID: "53aa8ccd-0db2-41b4-ae8d-328813643b2c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:51 crc kubenswrapper[4933]: I1202 16:17:51.963461 4933 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53aa8ccd-0db2-41b4-ae8d-328813643b2c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.058603 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.168658 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-combined-ca-bundle\") pod \"ce47649a-d30e-4b04-a077-b6efde5fd587\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.169339 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plffx\" (UniqueName: \"kubernetes.io/projected/ce47649a-d30e-4b04-a077-b6efde5fd587-kube-api-access-plffx\") pod \"ce47649a-d30e-4b04-a077-b6efde5fd587\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.169407 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-config-data\") pod \"ce47649a-d30e-4b04-a077-b6efde5fd587\" (UID: \"ce47649a-d30e-4b04-a077-b6efde5fd587\") " Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.180116 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce47649a-d30e-4b04-a077-b6efde5fd587-kube-api-access-plffx" (OuterVolumeSpecName: "kube-api-access-plffx") pod "ce47649a-d30e-4b04-a077-b6efde5fd587" (UID: "ce47649a-d30e-4b04-a077-b6efde5fd587"). InnerVolumeSpecName "kube-api-access-plffx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.232957 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce47649a-d30e-4b04-a077-b6efde5fd587" (UID: "ce47649a-d30e-4b04-a077-b6efde5fd587"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.249705 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-config-data" (OuterVolumeSpecName: "config-data") pod "ce47649a-d30e-4b04-a077-b6efde5fd587" (UID: "ce47649a-d30e-4b04-a077-b6efde5fd587"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.275717 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.275754 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plffx\" (UniqueName: \"kubernetes.io/projected/ce47649a-d30e-4b04-a077-b6efde5fd587-kube-api-access-plffx\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.275770 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce47649a-d30e-4b04-a077-b6efde5fd587-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.440579 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3186bcaa-4306-4842-885b-6afd00553e78","Type":"ContainerStarted","Data":"e5e27d9c8c1e18307938642576823550bd0cee1f34d34b68d1eb4548033ac6f4"} Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.441259 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.443307 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.445207 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce47649a-d30e-4b04-a077-b6efde5fd587","Type":"ContainerDied","Data":"8c322613760957c3c6815cac96bcf9e0b8425fa3e8925ecc5ae46c6e36de5f9a"} Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.445260 4933 scope.go:117] "RemoveContainer" containerID="e97c69945175a6e744512f5d20a08e2a73bfffca682fb8cef530f8175f578713" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.447231 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53aa8ccd-0db2-41b4-ae8d-328813643b2c","Type":"ContainerDied","Data":"1c0c9b6c79cad1f47cb90f9b166d14876557482399a10268fabf9c2f5efa2e90"} Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.447294 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.521937 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.521905907 podStartE2EDuration="2.521905907s" podCreationTimestamp="2025-12-02 16:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:52.464627819 +0000 UTC m=+1535.715854522" watchObservedRunningTime="2025-12-02 16:17:52.521905907 +0000 UTC m=+1535.773132610" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.546059 4933 scope.go:117] "RemoveContainer" containerID="96461a2111b4b13c070332e2a15890eb06a74d39d83e56e9237359fd2e4e01b8" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.627065 4933 scope.go:117] "RemoveContainer" containerID="c83d407b0e5b270eea1cc779ed5faf7b92536b573251ab18a8ca439b0d4055fa" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.627270 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.652414 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.665857 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.682833 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.699230 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: E1202 16:17:52.703338 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-log" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.703373 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-log" Dec 02 16:17:52 crc kubenswrapper[4933]: E1202 16:17:52.703413 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-metadata" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.703420 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-metadata" Dec 02 16:17:52 crc kubenswrapper[4933]: E1202 16:17:52.703558 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce47649a-d30e-4b04-a077-b6efde5fd587" containerName="nova-scheduler-scheduler" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.703574 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce47649a-d30e-4b04-a077-b6efde5fd587" containerName="nova-scheduler-scheduler" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.704411 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce47649a-d30e-4b04-a077-b6efde5fd587" containerName="nova-scheduler-scheduler" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.704477 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" containerName="nova-metadata-metadata" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.704489 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" 
containerName="nova-metadata-log" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.707311 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.707525 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.710581 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.720215 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.722211 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.728904 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.730847 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.738925 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.807767 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.807815 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ssl\" (UniqueName: \"kubernetes.io/projected/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-kube-api-access-42ssl\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.807915 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.807944 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.807967 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-logs\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.808017 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-config-data\") pod 
\"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.808034 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-config-data\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.808060 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jxw\" (UniqueName: \"kubernetes.io/projected/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-kube-api-access-59jxw\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909448 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909490 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909513 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-logs\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909569 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-config-data\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909584 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-config-data\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909611 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jxw\" (UniqueName: \"kubernetes.io/projected/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-kube-api-access-59jxw\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909697 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.909719 4933 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-42ssl\" (UniqueName: \"kubernetes.io/projected/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-kube-api-access-42ssl\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.910409 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-logs\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.913472 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.913958 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-config-data\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.914525 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.915440 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-config-data\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.915937 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.933149 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jxw\" (UniqueName: \"kubernetes.io/projected/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-kube-api-access-59jxw\") pod \"nova-scheduler-0\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " pod="openstack/nova-scheduler-0" Dec 02 16:17:52 crc kubenswrapper[4933]: I1202 16:17:52.937112 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ssl\" (UniqueName: \"kubernetes.io/projected/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-kube-api-access-42ssl\") pod \"nova-metadata-0\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " pod="openstack/nova-metadata-0" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.035281 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.060303 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.077541 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53aa8ccd-0db2-41b4-ae8d-328813643b2c" path="/var/lib/kubelet/pods/53aa8ccd-0db2-41b4-ae8d-328813643b2c/volumes" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.078619 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce47649a-d30e-4b04-a077-b6efde5fd587" path="/var/lib/kubelet/pods/ce47649a-d30e-4b04-a077-b6efde5fd587/volumes" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.502474 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerStarted","Data":"11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00"} Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.504017 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.572371 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.365093767 podStartE2EDuration="9.572352033s" podCreationTimestamp="2025-12-02 16:17:44 +0000 UTC" firstStartedPulling="2025-12-02 16:17:45.420627619 +0000 UTC m=+1528.671854322" lastFinishedPulling="2025-12-02 16:17:52.627885885 +0000 UTC m=+1535.879112588" observedRunningTime="2025-12-02 16:17:53.570296226 +0000 UTC m=+1536.821522919" watchObservedRunningTime="2025-12-02 16:17:53.572352033 +0000 UTC m=+1536.823578756" Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.631135 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:17:53 crc kubenswrapper[4933]: I1202 16:17:53.896410 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:17:53 crc kubenswrapper[4933]: W1202 16:17:53.902128 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37fc2f39_ccfe_45bc_b2e0_df5cabe8c327.slice/crio-14197c9753e17c31e8492c8d7ac7455c0978e4a3102ea5720f559f41ca6ca6cb WatchSource:0}: Error finding container 14197c9753e17c31e8492c8d7ac7455c0978e4a3102ea5720f559f41ca6ca6cb: Status 404 returned error can't find the container with id 14197c9753e17c31e8492c8d7ac7455c0978e4a3102ea5720f559f41ca6ca6cb Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.388731 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.459649 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93444a85-ddc7-4883-9a66-8a759201161d-logs\") pod \"93444a85-ddc7-4883-9a66-8a759201161d\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.459799 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4pr9\" (UniqueName: \"kubernetes.io/projected/93444a85-ddc7-4883-9a66-8a759201161d-kube-api-access-s4pr9\") pod \"93444a85-ddc7-4883-9a66-8a759201161d\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.459881 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-config-data\") pod \"93444a85-ddc7-4883-9a66-8a759201161d\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.459969 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-combined-ca-bundle\") pod \"93444a85-ddc7-4883-9a66-8a759201161d\" (UID: \"93444a85-ddc7-4883-9a66-8a759201161d\") " Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.460665 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93444a85-ddc7-4883-9a66-8a759201161d-logs" (OuterVolumeSpecName: "logs") pod "93444a85-ddc7-4883-9a66-8a759201161d" (UID: "93444a85-ddc7-4883-9a66-8a759201161d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.465692 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93444a85-ddc7-4883-9a66-8a759201161d-kube-api-access-s4pr9" (OuterVolumeSpecName: "kube-api-access-s4pr9") pod "93444a85-ddc7-4883-9a66-8a759201161d" (UID: "93444a85-ddc7-4883-9a66-8a759201161d"). InnerVolumeSpecName "kube-api-access-s4pr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.514273 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93444a85-ddc7-4883-9a66-8a759201161d" (UID: "93444a85-ddc7-4883-9a66-8a759201161d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.515780 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-config-data" (OuterVolumeSpecName: "config-data") pod "93444a85-ddc7-4883-9a66-8a759201161d" (UID: "93444a85-ddc7-4883-9a66-8a759201161d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.529516 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c58ccba-4def-4328-b1c9-6751fb2dcb0f","Type":"ContainerStarted","Data":"c9e66550cd7499a508258e4f85a3033dc0817b18afd31a5e11a6a13b62738d1d"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.529559 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c58ccba-4def-4328-b1c9-6751fb2dcb0f","Type":"ContainerStarted","Data":"c1c1bbbda72d0f0ca5ee5e7362af506936d4c8aae1cbe6535f4b65a222614989"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.536523 4933 generic.go:334] "Generic (PLEG): container finished" podID="93444a85-ddc7-4883-9a66-8a759201161d" containerID="4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a" exitCode=0 Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.536604 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93444a85-ddc7-4883-9a66-8a759201161d","Type":"ContainerDied","Data":"4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.536640 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93444a85-ddc7-4883-9a66-8a759201161d","Type":"ContainerDied","Data":"a3aa93e7aa55cec41b4491e6edfb29cd79545ae26f11ba3526cca57728a401eb"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.536662 4933 scope.go:117] "RemoveContainer" containerID="4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.536813 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.554195 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327","Type":"ContainerStarted","Data":"d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.554228 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327","Type":"ContainerStarted","Data":"d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.554241 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327","Type":"ContainerStarted","Data":"14197c9753e17c31e8492c8d7ac7455c0978e4a3102ea5720f559f41ca6ca6cb"} Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.564301 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.564340 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93444a85-ddc7-4883-9a66-8a759201161d-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.564353 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4pr9\" (UniqueName: \"kubernetes.io/projected/93444a85-ddc7-4883-9a66-8a759201161d-kube-api-access-s4pr9\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.564366 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93444a85-ddc7-4883-9a66-8a759201161d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.573210 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.573189213 podStartE2EDuration="2.573189213s" podCreationTimestamp="2025-12-02 16:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:54.549543182 +0000 UTC m=+1537.800769885" watchObservedRunningTime="2025-12-02 16:17:54.573189213 +0000 UTC m=+1537.824415916" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.596020 4933 scope.go:117] "RemoveContainer" containerID="cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.601525 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.627185 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.652944 4933 scope.go:117] "RemoveContainer" containerID="4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a" Dec 02 16:17:54 crc kubenswrapper[4933]: E1202 16:17:54.656986 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a\": container with ID starting with 
4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a not found: ID does not exist" containerID="4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.657044 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a"} err="failed to get container status \"4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a\": rpc error: code = NotFound desc = could not find container \"4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a\": container with ID starting with 4560a6b592e8ed05b2f609deada274875270c0176efe00dda16466370776192a not found: ID does not exist" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.657078 4933 scope.go:117] "RemoveContainer" containerID="cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.657194 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:54 crc kubenswrapper[4933]: E1202 16:17:54.657748 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-log" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.657772 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-log" Dec 02 16:17:54 crc kubenswrapper[4933]: E1202 16:17:54.657839 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-api" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.657850 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-api" Dec 02 16:17:54 crc kubenswrapper[4933]: E1202 16:17:54.658121 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d\": container with ID starting with cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d not found: ID does not exist" containerID="cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.658166 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-api" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.658198 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="93444a85-ddc7-4883-9a66-8a759201161d" containerName="nova-api-log" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.658163 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d"} err="failed to get container status \"cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d\": rpc error: code = NotFound desc = could not find container \"cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d\": container with ID starting with cd4f4c4b0d3805a1717e6f1723206a2ffdb36477aced745e839329aeb1e7d82d not found: ID does not exist" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.659723 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.661864 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.718622 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.718598207 podStartE2EDuration="2.718598207s" podCreationTimestamp="2025-12-02 16:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:54.609122903 +0000 UTC m=+1537.860349616" watchObservedRunningTime="2025-12-02 16:17:54.718598207 +0000 UTC m=+1537.969824910" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.732864 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d822a2fc-63d8-4767-805b-d68ff0173248-logs\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.733194 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-config-data\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.733279 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45z8\" (UniqueName: \"kubernetes.io/projected/d822a2fc-63d8-4767-805b-d68ff0173248-kube-api-access-n45z8\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.733311 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.755290 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.837324 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d822a2fc-63d8-4767-805b-d68ff0173248-logs\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.837467 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-config-data\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.837507 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45z8\" (UniqueName: \"kubernetes.io/projected/d822a2fc-63d8-4767-805b-d68ff0173248-kube-api-access-n45z8\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.837533 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.838295 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d822a2fc-63d8-4767-805b-d68ff0173248-logs\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.841969 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.844315 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-config-data\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.854554 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45z8\" (UniqueName: \"kubernetes.io/projected/d822a2fc-63d8-4767-805b-d68ff0173248-kube-api-access-n45z8\") pod \"nova-api-0\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " pod="openstack/nova-api-0" Dec 02 16:17:54 crc kubenswrapper[4933]: I1202 16:17:54.999645 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:17:55 crc kubenswrapper[4933]: I1202 16:17:55.073270 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93444a85-ddc7-4883-9a66-8a759201161d" path="/var/lib/kubelet/pods/93444a85-ddc7-4883-9a66-8a759201161d/volumes" Dec 02 16:17:55 crc kubenswrapper[4933]: I1202 16:17:55.488940 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:17:55 crc kubenswrapper[4933]: W1202 16:17:55.491252 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd822a2fc_63d8_4767_805b_d68ff0173248.slice/crio-f7dd600d25baa7a5b37d7f9d7268fb021b8bd0f3b5b4b9d9457e04501dae7572 WatchSource:0}: Error finding container f7dd600d25baa7a5b37d7f9d7268fb021b8bd0f3b5b4b9d9457e04501dae7572: Status 404 returned error can't find the container with id f7dd600d25baa7a5b37d7f9d7268fb021b8bd0f3b5b4b9d9457e04501dae7572 Dec 02 16:17:55 crc kubenswrapper[4933]: I1202 16:17:55.574035 4933 generic.go:334] "Generic (PLEG): container finished" podID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerID="895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c" exitCode=0 Dec 02 16:17:55 crc kubenswrapper[4933]: I1202 16:17:55.574116 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw45r" event={"ID":"1b21b605-e2e1-412d-86d6-116c3c6684dd","Type":"ContainerDied","Data":"895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c"} Dec 02 16:17:55 crc kubenswrapper[4933]: I1202 16:17:55.578975 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d822a2fc-63d8-4767-805b-d68ff0173248","Type":"ContainerStarted","Data":"f7dd600d25baa7a5b37d7f9d7268fb021b8bd0f3b5b4b9d9457e04501dae7572"} Dec 02 16:17:55 crc kubenswrapper[4933]: E1202 16:17:55.992926 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:56 crc kubenswrapper[4933]: I1202 16:17:56.595168 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw45r" event={"ID":"1b21b605-e2e1-412d-86d6-116c3c6684dd","Type":"ContainerStarted","Data":"97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb"} Dec 02 16:17:56 crc kubenswrapper[4933]: I1202 16:17:56.599189 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d822a2fc-63d8-4767-805b-d68ff0173248","Type":"ContainerStarted","Data":"60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb"} Dec 02 16:17:56 crc kubenswrapper[4933]: I1202 16:17:56.599243 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d822a2fc-63d8-4767-805b-d68ff0173248","Type":"ContainerStarted","Data":"0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6"} Dec 02 16:17:56 crc kubenswrapper[4933]: I1202 16:17:56.628162 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.62814468 podStartE2EDuration="2.62814468s" podCreationTimestamp="2025-12-02 16:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:17:56.616569952 +0000 UTC m=+1539.867796665" watchObservedRunningTime="2025-12-02 16:17:56.62814468 +0000 UTC m=+1539.879371383" Dec 02 16:17:56 crc kubenswrapper[4933]: E1202 16:17:56.938607 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:17:57 crc kubenswrapper[4933]: I1202 16:17:57.643330 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hw45r" podStartSLOduration=4.066511534 podStartE2EDuration="8.643303955s" podCreationTimestamp="2025-12-02 16:17:49 +0000 UTC" firstStartedPulling="2025-12-02 16:17:51.426173443 +0000 UTC m=+1534.677400146" lastFinishedPulling="2025-12-02 16:17:56.002965864 +0000 UTC m=+1539.254192567" observedRunningTime="2025-12-02 16:17:57.628292012 +0000 UTC m=+1540.879518715" watchObservedRunningTime="2025-12-02 16:17:57.643303955 +0000 UTC m=+1540.894530658" Dec 02 16:17:58 crc kubenswrapper[4933]: I1202 16:17:58.035300 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 16:17:58 crc kubenswrapper[4933]: I1202 16:17:58.061357 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Dec 02 16:17:58 crc kubenswrapper[4933]: I1202 16:17:58.061404 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 16:18:00 crc kubenswrapper[4933]: I1202 16:18:00.005516 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:18:00 crc kubenswrapper[4933]: I1202 16:18:00.005813 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:18:00 crc kubenswrapper[4933]: I1202 16:18:00.083388 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:18:00 crc kubenswrapper[4933]: I1202 16:18:00.801731 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 16:18:01 crc kubenswrapper[4933]: I1202 16:18:01.706421 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:18:01 crc kubenswrapper[4933]: I1202 16:18:01.756170 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw45r"] Dec 02 16:18:03 crc kubenswrapper[4933]: I1202 16:18:03.035830 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 16:18:03 crc kubenswrapper[4933]: I1202 16:18:03.069405 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 16:18:03 crc kubenswrapper[4933]: I1202 16:18:03.069451 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 16:18:03 crc kubenswrapper[4933]: I1202 16:18:03.069529 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 16:18:03 crc kubenswrapper[4933]: I1202 16:18:03.678412 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hw45r" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="registry-server" containerID="cri-o://97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb" gracePeriod=2 Dec 02 16:18:03 crc kubenswrapper[4933]: I1202 16:18:03.713951 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.071783 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.247:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.079760 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.247:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.360645 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.385150 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-catalog-content\") pod \"1b21b605-e2e1-412d-86d6-116c3c6684dd\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.385286 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcs79\" (UniqueName: \"kubernetes.io/projected/1b21b605-e2e1-412d-86d6-116c3c6684dd-kube-api-access-wcs79\") pod \"1b21b605-e2e1-412d-86d6-116c3c6684dd\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.385402 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-utilities\") pod \"1b21b605-e2e1-412d-86d6-116c3c6684dd\" (UID: \"1b21b605-e2e1-412d-86d6-116c3c6684dd\") " Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.386515 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-utilities" (OuterVolumeSpecName: "utilities") pod "1b21b605-e2e1-412d-86d6-116c3c6684dd" (UID: "1b21b605-e2e1-412d-86d6-116c3c6684dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.397480 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b21b605-e2e1-412d-86d6-116c3c6684dd-kube-api-access-wcs79" (OuterVolumeSpecName: "kube-api-access-wcs79") pod "1b21b605-e2e1-412d-86d6-116c3c6684dd" (UID: "1b21b605-e2e1-412d-86d6-116c3c6684dd"). InnerVolumeSpecName "kube-api-access-wcs79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.407936 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b21b605-e2e1-412d-86d6-116c3c6684dd" (UID: "1b21b605-e2e1-412d-86d6-116c3c6684dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.487223 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.487263 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b21b605-e2e1-412d-86d6-116c3c6684dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.487281 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcs79\" (UniqueName: \"kubernetes.io/projected/1b21b605-e2e1-412d-86d6-116c3c6684dd-kube-api-access-wcs79\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.694481 4933 generic.go:334] "Generic (PLEG): container finished" podID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerID="97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb" exitCode=0 Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.694525 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw45r" event={"ID":"1b21b605-e2e1-412d-86d6-116c3c6684dd","Type":"ContainerDied","Data":"97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb"} Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.694586 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw45r" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.694614 4933 scope.go:117] "RemoveContainer" containerID="97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.694599 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw45r" event={"ID":"1b21b605-e2e1-412d-86d6-116c3c6684dd","Type":"ContainerDied","Data":"d75573f692765899c60adbe4bafe4f66ea48bcbd5267e76863dd0d03d5e2d1f8"} Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.741776 4933 scope.go:117] "RemoveContainer" containerID="895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.746667 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw45r"] Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.771765 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw45r"] Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.773960 4933 scope.go:117] "RemoveContainer" containerID="a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.844312 4933 scope.go:117] "RemoveContainer" containerID="97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb" Dec 02 16:18:04 crc kubenswrapper[4933]: E1202 16:18:04.844764 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb\": container with ID starting with 97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb not found: ID does not exist" containerID="97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.844805 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb"} err="failed to get container status \"97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb\": rpc error: code = NotFound desc = could not find container \"97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb\": container with ID starting with 97d2bcae9cb162a443fc0744408178dec66b614385f6d4a4a52652403bb227eb not found: ID does not exist" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.844850 4933 scope.go:117] "RemoveContainer" containerID="895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c" Dec 02 16:18:04 crc kubenswrapper[4933]: E1202 16:18:04.845321 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c\": container with ID starting with 895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c not found: ID does not exist" containerID="895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.845345 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c"} err="failed to get container status \"895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c\": rpc error: code = NotFound desc = could not find container \"895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c\": container with ID starting with 895ad2acbbb9e13805177dcc39bd6a97c43c16eb24912641c9f832800498688c not found: ID does not exist" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.845357 4933 scope.go:117] "RemoveContainer" containerID="a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359" Dec 02 16:18:04 crc kubenswrapper[4933]: E1202 16:18:04.845606 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359\": container with ID starting with a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359 not found: ID does not exist" containerID="a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359" Dec 02 16:18:04 crc kubenswrapper[4933]: I1202 16:18:04.845626 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359"} err="failed to get container status \"a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359\": rpc error: code = NotFound desc = could not find container \"a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359\": container with ID starting with a9abab9ad1fa4df2e78331c5c543b0c5fc2a1d9c35731699f66e355e460c1359 not found: ID does not exist" Dec 02 16:18:05 crc kubenswrapper[4933]: I1202 16:18:05.000728 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:18:05 crc kubenswrapper[4933]: I1202 16:18:05.000807 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:18:05 crc kubenswrapper[4933]: I1202 16:18:05.072553 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" 
path="/var/lib/kubelet/pods/1b21b605-e2e1-412d-86d6-116c3c6684dd/volumes" Dec 02 16:18:06 crc kubenswrapper[4933]: E1202 16:18:06.051357 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:18:06 crc kubenswrapper[4933]: I1202 16:18:06.082105 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.248:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:06 crc kubenswrapper[4933]: I1202 16:18:06.082247 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.248:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:12 crc kubenswrapper[4933]: E1202 16:18:12.206640 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:18:13 crc kubenswrapper[4933]: I1202 16:18:13.075269 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 16:18:13 crc kubenswrapper[4933]: I1202 16:18:13.075345 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 16:18:13 crc kubenswrapper[4933]: I1202 16:18:13.081846 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 16:18:13 crc kubenswrapper[4933]: I1202 16:18:13.088949 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.413395 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.473540 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.639360 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzmvb\" (UniqueName: \"kubernetes.io/projected/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-kube-api-access-dzmvb\") pod \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.639444 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-config-data\") pod \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.639494 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-combined-ca-bundle\") pod \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\" (UID: \"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb\") " Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.655672 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-kube-api-access-dzmvb" (OuterVolumeSpecName: "kube-api-access-dzmvb") pod "2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" (UID: "2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb"). InnerVolumeSpecName "kube-api-access-dzmvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.681851 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-config-data" (OuterVolumeSpecName: "config-data") pod "2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" (UID: "2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.685569 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" (UID: "2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.742293 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzmvb\" (UniqueName: \"kubernetes.io/projected/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-kube-api-access-dzmvb\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.742331 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.742342 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.750668 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cjgzk"] Dec 02 16:18:14 crc kubenswrapper[4933]: E1202 16:18:14.751330 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="registry-server" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.751354 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="registry-server" Dec 02 16:18:14 crc kubenswrapper[4933]: E1202 16:18:14.751400 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.751410 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 16:18:14 crc kubenswrapper[4933]: E1202 16:18:14.751431 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="extract-utilities" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.751441 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="extract-utilities" Dec 02 16:18:14 crc kubenswrapper[4933]: E1202 16:18:14.751459 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="extract-content" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.751466 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="extract-content" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.751800 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.751839 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b21b605-e2e1-412d-86d6-116c3c6684dd" containerName="registry-server" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.754004 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.765729 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjgzk"] Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.822823 4933 generic.go:334] "Generic (PLEG): container finished" podID="2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" containerID="d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7" exitCode=137 Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.822934 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.822938 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb","Type":"ContainerDied","Data":"d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7"} Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.823328 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb","Type":"ContainerDied","Data":"4f40f8c95cabad472d9807c6cc7daa52032e5c73c9d45d7e4ac7f67a683246e8"} Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.823353 4933 scope.go:117] "RemoveContainer" containerID="d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.869545 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzln9\" (UniqueName: \"kubernetes.io/projected/50ec02ba-6f45-4d20-a016-53cd873906ae-kube-api-access-hzln9\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.870017 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-catalog-content\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.870129 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-utilities\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.904891 4933 scope.go:117] "RemoveContainer" containerID="d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7" Dec 02 16:18:14 crc kubenswrapper[4933]: E1202 16:18:14.906000 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7\": container with ID starting with d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7 not found: ID does not exist" containerID="d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.906040 4933 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7"} err="failed to get container status \"d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7\": rpc error: code = NotFound desc = could not find container \"d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7\": container with ID starting with d2a1efc4c7f6b05ce7d42809006ebfae41feec28cf69e3d3e5206e892dd0ece7 not found: ID does not exist" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.918963 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.944103 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.959870 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.961459 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.965692 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.965770 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.965806 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972352 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzln9\" (UniqueName: \"kubernetes.io/projected/50ec02ba-6f45-4d20-a016-53cd873906ae-kube-api-access-hzln9\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972418 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972432 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972546 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf2vx\" (UniqueName: \"kubernetes.io/projected/d7593905-f88c-466a-b547-9a3e59588987-kube-api-access-nf2vx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972581 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-catalog-content\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972658 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-utilities\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.972708 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.973578 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-catalog-content\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.973899 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-utilities\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:14 crc kubenswrapper[4933]: I1202 16:18:14.993236 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzln9\" (UniqueName: \"kubernetes.io/projected/50ec02ba-6f45-4d20-a016-53cd873906ae-kube-api-access-hzln9\") pod \"certified-operators-cjgzk\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.014908 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.016480 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.020055 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.024247 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.066509 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb" path="/var/lib/kubelet/pods/2e5cd02d-fd19-452b-a4c0-eb3dbf888dfb/volumes" Dec 02 
16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.074924 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.075102 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf2vx\" (UniqueName: \"kubernetes.io/projected/d7593905-f88c-466a-b547-9a3e59588987-kube-api-access-nf2vx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.075139 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.075181 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.075287 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.079062 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.079105 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.079711 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.081070 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7593905-f88c-466a-b547-9a3e59588987-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.090606 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-nf2vx\" (UniqueName: \"kubernetes.io/projected/d7593905-f88c-466a-b547-9a3e59588987-kube-api-access-nf2vx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d7593905-f88c-466a-b547-9a3e59588987\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.105983 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.294261 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:15 crc kubenswrapper[4933]: W1202 16:18:15.614068 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ec02ba_6f45_4d20_a016_53cd873906ae.slice/crio-7c4ead521110f803e67ece6e82177718b38efc85504f0828d70f9ed5932d4b3a WatchSource:0}: Error finding container 7c4ead521110f803e67ece6e82177718b38efc85504f0828d70f9ed5932d4b3a: Status 404 returned error can't find the container with id 7c4ead521110f803e67ece6e82177718b38efc85504f0828d70f9ed5932d4b3a Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.619553 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjgzk"] Dec 02 16:18:15 crc kubenswrapper[4933]: W1202 16:18:15.808340 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7593905_f88c_466a_b547_9a3e59588987.slice/crio-219afe1acde9331670a54f58d9adf8775e7e26f98281fe5bea0ea895f2534ea3 WatchSource:0}: Error finding container 219afe1acde9331670a54f58d9adf8775e7e26f98281fe5bea0ea895f2534ea3: Status 404 returned error can't find the container with id 219afe1acde9331670a54f58d9adf8775e7e26f98281fe5bea0ea895f2534ea3 Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.815606 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.844837 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d7593905-f88c-466a-b547-9a3e59588987","Type":"ContainerStarted","Data":"219afe1acde9331670a54f58d9adf8775e7e26f98281fe5bea0ea895f2534ea3"} Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.847759 4933 generic.go:334] "Generic (PLEG): container finished" podID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerID="09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d" exitCode=0 Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.848097 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerDied","Data":"09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d"} Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.848275 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerStarted","Data":"7c4ead521110f803e67ece6e82177718b38efc85504f0828d70f9ed5932d4b3a"} Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.852209 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 16:18:15 crc kubenswrapper[4933]: I1202 16:18:15.863210 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.039374 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"] Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.047044 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.078425 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"] Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.104239 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.104450 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.104979 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.105638 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.105912 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7xw\" (UniqueName: \"kubernetes.io/projected/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-kube-api-access-cr7xw\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.106236 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-config\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: E1202 16:18:16.119652 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e06c62d_ee51_43e3_aa5e_eb045d4ec1c8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde07d2_340f_48ad_bb66_db94386d9052.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.240175 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.240265 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.240366 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.240445 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7xw\" (UniqueName: \"kubernetes.io/projected/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-kube-api-access-cr7xw\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.240541 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-config\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.240590 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.241654 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.241651 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.241658 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.242231 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-config\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.242365 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.278977 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7xw\" (UniqueName: \"kubernetes.io/projected/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-kube-api-access-cr7xw\") pod \"dnsmasq-dns-6d99f6bc7f-xjx4w\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.445872 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.830544 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.863124 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-combined-ca-bundle\") pod \"a9e6f954-bf8f-4aab-a398-d3320706266e\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.863254 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-scripts\") pod \"a9e6f954-bf8f-4aab-a398-d3320706266e\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.863285 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-config-data\") pod \"a9e6f954-bf8f-4aab-a398-d3320706266e\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.863528 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmlkz\" (UniqueName: \"kubernetes.io/projected/a9e6f954-bf8f-4aab-a398-d3320706266e-kube-api-access-tmlkz\") pod \"a9e6f954-bf8f-4aab-a398-d3320706266e\" (UID: \"a9e6f954-bf8f-4aab-a398-d3320706266e\") " Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.871583 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-scripts" (OuterVolumeSpecName: "scripts") pod "a9e6f954-bf8f-4aab-a398-d3320706266e" (UID: "a9e6f954-bf8f-4aab-a398-d3320706266e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.873088 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e6f954-bf8f-4aab-a398-d3320706266e-kube-api-access-tmlkz" (OuterVolumeSpecName: "kube-api-access-tmlkz") pod "a9e6f954-bf8f-4aab-a398-d3320706266e" (UID: "a9e6f954-bf8f-4aab-a398-d3320706266e"). 
InnerVolumeSpecName "kube-api-access-tmlkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.882737 4933 generic.go:334] "Generic (PLEG): container finished" podID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerID="d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198" exitCode=137 Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.882813 4933 generic.go:334] "Generic (PLEG): container finished" podID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerID="6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6" exitCode=137 Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.882933 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerDied","Data":"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198"} Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.882959 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerDied","Data":"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6"} Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.882968 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a9e6f954-bf8f-4aab-a398-d3320706266e","Type":"ContainerDied","Data":"6a5d7f2f823dc517cefa6624667db900084a4d402679db3eafd1cfcfc4f86a5e"} Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.882982 4933 scope.go:117] "RemoveContainer" containerID="d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.883111 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.889909 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d7593905-f88c-466a-b547-9a3e59588987","Type":"ContainerStarted","Data":"bb4ce27370ddff1236efb934daf644a70b390d014baf852f0ca1f08319745482"} Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.930993 4933 scope.go:117] "RemoveContainer" containerID="6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.946070 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.946050074 podStartE2EDuration="2.946050074s" podCreationTimestamp="2025-12-02 16:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:16.924345466 +0000 UTC m=+1560.175572169" watchObservedRunningTime="2025-12-02 16:18:16.946050074 +0000 UTC m=+1560.197276777" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.969111 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmlkz\" (UniqueName: \"kubernetes.io/projected/a9e6f954-bf8f-4aab-a398-d3320706266e-kube-api-access-tmlkz\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.969141 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:16 crc kubenswrapper[4933]: I1202 16:18:16.976369 4933 scope.go:117] "RemoveContainer" containerID="a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.066219 4933 scope.go:117] "RemoveContainer" containerID="3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.174692 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.174772 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.226058 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9e6f954-bf8f-4aab-a398-d3320706266e" (UID: "a9e6f954-bf8f-4aab-a398-d3320706266e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.249378 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-config-data" (OuterVolumeSpecName: "config-data") pod "a9e6f954-bf8f-4aab-a398-d3320706266e" (UID: "a9e6f954-bf8f-4aab-a398-d3320706266e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.285171 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.285496 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e6f954-bf8f-4aab-a398-d3320706266e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.400148 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"] Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.422252 4933 scope.go:117] "RemoveContainer" containerID="d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.422718 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198\": container with ID starting with d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198 not found: ID does not exist" containerID="d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.422751 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198"} err="failed to get container status \"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198\": rpc error: code = NotFound desc = could not find container \"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198\": container with ID starting with d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.422772 4933 scope.go:117] "RemoveContainer" containerID="6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.423273 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6\": container with ID starting with 6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6 not found: ID does not exist" containerID="6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.423295 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6"} err="failed to get container status \"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6\": rpc error: code = NotFound desc = could not find container \"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6\": container with ID starting with 6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.423312 4933 scope.go:117] "RemoveContainer" containerID="a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.423666 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4\": container with ID starting with a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4 not found: ID does not exist" containerID="a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.423694 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4"} err="failed to get container status \"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4\": rpc error: code = NotFound desc = could not find container \"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4\": container with ID starting with a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.423707 4933 scope.go:117] "RemoveContainer" containerID="3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.424100 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472\": container with ID starting with 3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472 not found: ID does not exist" containerID="3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424136 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472"} err="failed to get container status \"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472\": rpc error: code = NotFound desc = could not find container \"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472\": container with ID starting with 3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424150 4933 scope.go:117] "RemoveContainer" containerID="d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424485 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198"} err="failed to get container status \"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198\": rpc error: code = NotFound desc = could not find container \"d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198\": container with ID starting with d5a2bf8c4e5e0c59a8e2da171d829bd56a23120937e6f15e060463c3fed44198 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424505 4933 scope.go:117] "RemoveContainer" containerID="6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424739 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6"} err="failed to get container status \"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6\": rpc error: code = NotFound desc = could not find container 
\"6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6\": container with ID starting with 6070de53e73b654e425984097cbcbb55c54d32a6fa3cb2d323524bb3256a50c6 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424758 4933 scope.go:117] "RemoveContainer" containerID="a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.424987 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4"} err="failed to get container status \"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4\": rpc error: code = NotFound desc = could not find container \"a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4\": container with ID starting with a18d07d7710c65de4eece573380d203afe63bddee8a74e00d3d854d1ab8f07f4 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.425007 4933 scope.go:117] "RemoveContainer" containerID="3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.425214 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472"} err="failed to get container status \"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472\": rpc error: code = NotFound desc = could not find container \"3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472\": container with ID starting with 3927949d979b6f21a1bd1cf411956109309710dcddf32b5a9f27b0ef47d6c472 not found: ID does not exist" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.536244 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.549372 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.561852 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.562402 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-api" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562422 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-api" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.562449 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-listener" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562457 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-listener" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.562475 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-evaluator" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562481 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-evaluator" Dec 02 16:18:17 crc kubenswrapper[4933]: E1202 16:18:17.562507 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-notifier" Dec 02 16:18:17 
crc kubenswrapper[4933]: I1202 16:18:17.562513 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-notifier" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562711 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-notifier" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562730 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-evaluator" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562745 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-api" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.562763 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" containerName="aodh-listener" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.564793 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.569184 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.569354 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-g75sl" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.569501 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.569625 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.569753 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.575062 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.693813 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-internal-tls-certs\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.694080 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-config-data\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.694344 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-scripts\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.694528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-public-tls-certs\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 
16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.694661 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-combined-ca-bundle\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.694850 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtf2\" (UniqueName: \"kubernetes.io/projected/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-kube-api-access-vvtf2\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.800347 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-public-tls-certs\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.800413 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-combined-ca-bundle\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.800484 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtf2\" (UniqueName: \"kubernetes.io/projected/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-kube-api-access-vvtf2\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.800626 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-internal-tls-certs\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.800702 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-config-data\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.800769 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-scripts\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.806707 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-config-data\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.807414 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-combined-ca-bundle\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.818874 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-scripts\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.821441 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-internal-tls-certs\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.824083 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-public-tls-certs\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.831516 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtf2\" (UniqueName: \"kubernetes.io/projected/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-kube-api-access-vvtf2\") pod \"aodh-0\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " pod="openstack/aodh-0" Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.905140 4933 generic.go:334] "Generic (PLEG): container finished" podID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerID="9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6" exitCode=0 Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.905205 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" event={"ID":"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4","Type":"ContainerDied","Data":"9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6"} Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.905234 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" event={"ID":"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4","Type":"ContainerStarted","Data":"d70e6ae6dff7504fbed38c07dc7234938f58f90ee33d8b5ef964005348c2197d"} Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.912513 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerStarted","Data":"b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4"} Dec 02 16:18:17 crc kubenswrapper[4933]: I1202 16:18:17.982967 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 16:18:18 crc kubenswrapper[4933]: I1202 16:18:18.887811 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 16:18:18 crc kubenswrapper[4933]: I1202 16:18:18.967863 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" event={"ID":"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4","Type":"ContainerStarted","Data":"7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45"} Dec 02 16:18:18 crc kubenswrapper[4933]: I1202 16:18:18.968042 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:18 crc kubenswrapper[4933]: I1202 16:18:18.970568 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerStarted","Data":"1bab90aa13745309c7af191669811adb8e31c099193d3ceb0c1f9464d49ebf01"} Dec 02 16:18:18 crc kubenswrapper[4933]: I1202 16:18:18.995028 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" podStartSLOduration=3.995007447 podStartE2EDuration="3.995007447s" podCreationTimestamp="2025-12-02 16:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:18.989943167 +0000 UTC m=+1562.241169860" watchObservedRunningTime="2025-12-02 16:18:18.995007447 +0000 UTC m=+1562.246234150" Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.078387 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e6f954-bf8f-4aab-a398-d3320706266e" path="/var/lib/kubelet/pods/a9e6f954-bf8f-4aab-a398-d3320706266e/volumes" Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.112227 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.112482 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-log" containerID="cri-o://0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6" gracePeriod=30 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.112943 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-api" containerID="cri-o://60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb" gracePeriod=30 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.886773 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.887716 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-central-agent" containerID="cri-o://dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa" gracePeriod=30 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.887763 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="sg-core" containerID="cri-o://9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e" gracePeriod=30 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.887785 4933 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-notification-agent" containerID="cri-o://243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223" gracePeriod=30 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.887807 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="proxy-httpd" containerID="cri-o://11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00" gracePeriod=30 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.984794 4933 generic.go:334] "Generic (PLEG): container finished" podID="d822a2fc-63d8-4767-805b-d68ff0173248" containerID="0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6" exitCode=143 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.984882 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d822a2fc-63d8-4767-805b-d68ff0173248","Type":"ContainerDied","Data":"0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6"} Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.987971 4933 generic.go:334] "Generic (PLEG): container finished" podID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerID="b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4" exitCode=0 Dec 02 16:18:19 crc kubenswrapper[4933]: I1202 16:18:19.988060 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerDied","Data":"b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4"} Dec 02 16:18:20 crc kubenswrapper[4933]: I1202 16:18:20.294629 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.001462 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerStarted","Data":"a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd"} Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.005147 4933 generic.go:334] "Generic (PLEG): container finished" podID="43f4221c-6190-460d-9bd8-2a20b003c890" containerID="11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00" exitCode=0 Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.005187 4933 generic.go:334] "Generic (PLEG): container finished" podID="43f4221c-6190-460d-9bd8-2a20b003c890" containerID="9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e" exitCode=2 Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.005196 4933 generic.go:334] "Generic (PLEG): container finished" podID="43f4221c-6190-460d-9bd8-2a20b003c890" containerID="dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa" exitCode=0 Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.005217 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerDied","Data":"11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00"} Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.005242 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerDied","Data":"9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e"} Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.005252 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerDied","Data":"dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa"} Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.027354 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cjgzk" podStartSLOduration=2.282467232 podStartE2EDuration="7.027332031s" podCreationTimestamp="2025-12-02 16:18:14 +0000 UTC" firstStartedPulling="2025-12-02 16:18:15.850529417 +0000 UTC m=+1559.101756130" lastFinishedPulling="2025-12-02 16:18:20.595394226 +0000 UTC m=+1563.846620929" observedRunningTime="2025-12-02 16:18:21.017609943 +0000 UTC m=+1564.268836646" watchObservedRunningTime="2025-12-02 16:18:21.027332031 +0000 UTC m=+1564.278558744" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.809021 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.932329 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-combined-ca-bundle\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.932861 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-config-data\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.932971 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-scripts\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.933227 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-run-httpd\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.933750 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.933858 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-log-httpd\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.933941 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4zn\" (UniqueName: \"kubernetes.io/projected/43f4221c-6190-460d-9bd8-2a20b003c890-kube-api-access-4x4zn\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.934034 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-sg-core-conf-yaml\") pod \"43f4221c-6190-460d-9bd8-2a20b003c890\" (UID: \"43f4221c-6190-460d-9bd8-2a20b003c890\") " Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.934439 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.935017 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.935032 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f4221c-6190-460d-9bd8-2a20b003c890-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.937477 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-scripts" (OuterVolumeSpecName: "scripts") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.948571 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f4221c-6190-460d-9bd8-2a20b003c890-kube-api-access-4x4zn" (OuterVolumeSpecName: "kube-api-access-4x4zn") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "kube-api-access-4x4zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:21 crc kubenswrapper[4933]: I1202 16:18:21.990369 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.038918 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.038949 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4zn\" (UniqueName: \"kubernetes.io/projected/43f4221c-6190-460d-9bd8-2a20b003c890-kube-api-access-4x4zn\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.038964 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.040170 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerStarted","Data":"38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2"} Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.046765 4933 generic.go:334] "Generic (PLEG): container finished" podID="43f4221c-6190-460d-9bd8-2a20b003c890" containerID="243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223" exitCode=0 Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.046835 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerDied","Data":"243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223"} Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.046873 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f4221c-6190-460d-9bd8-2a20b003c890","Type":"ContainerDied","Data":"881b25992a79e2f8ba60561d7658ae8dd712b24b2588f95eb75c4c5d1db0ac68"} Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.046894 4933 scope.go:117] "RemoveContainer" containerID="11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.047079 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.142142 4933 scope.go:117] "RemoveContainer" containerID="9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.144047 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.165108 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-config-data" (OuterVolumeSpecName: "config-data") pod "43f4221c-6190-460d-9bd8-2a20b003c890" (UID: "43f4221c-6190-460d-9bd8-2a20b003c890"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.181766 4933 scope.go:117] "RemoveContainer" containerID="243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.218241 4933 scope.go:117] "RemoveContainer" containerID="dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.237929 4933 scope.go:117] "RemoveContainer" containerID="11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.238752 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00\": container with ID starting with 11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00 not found: ID does not exist" containerID="11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.238786 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00"} err="failed to get container status \"11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00\": rpc error: code = NotFound desc = could not find container \"11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00\": container with ID starting with 11d50da1681d83e2a3f5ca983fe39e7ee44b4f02c55495cf197b101458e7db00 not found: ID does not exist" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.238839 4933 scope.go:117] "RemoveContainer" containerID="9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.239183 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e\": container with ID starting with 9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e not found: ID does not exist" containerID="9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.239223 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e"} err="failed to get container status \"9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e\": rpc error: code = NotFound desc = could not find container \"9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e\": container with ID starting with 9986c9817966e4ff1b75e7ca7999d10b38938cd839dab7b7cdbdce9ebb69e00e not found: ID does not exist" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.239251 4933 scope.go:117] "RemoveContainer" containerID="243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.239603 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223\": container with ID starting with 243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223 not found: ID does not exist" containerID="243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223" Dec 02 16:18:22 crc 
kubenswrapper[4933]: I1202 16:18:22.239645 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223"} err="failed to get container status \"243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223\": rpc error: code = NotFound desc = could not find container \"243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223\": container with ID starting with 243ad2aa6ae175b50f1df1da30fd2bc0383e0e56746f64f56579aa57e79b5223 not found: ID does not exist" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.239672 4933 scope.go:117] "RemoveContainer" containerID="dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.239971 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa\": container with ID starting with dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa not found: ID does not exist" containerID="dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.240031 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa"} err="failed to get container status \"dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa\": rpc error: code = NotFound desc = could not find container \"dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa\": container with ID starting with dfa961c2cdf52b7fd81f9f80d73b6fc10a164f842f9eb764e9bce33743bca3aa not found: ID does not exist" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.243858 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.243981 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f4221c-6190-460d-9bd8-2a20b003c890-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.455222 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.484430 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498038 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.498567 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-central-agent" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498579 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-central-agent" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.498595 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-notification-agent" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498600 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-notification-agent" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.498612 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="proxy-httpd" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498621 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="proxy-httpd" Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.498648 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="sg-core" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498653 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="sg-core" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498871 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="sg-core" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498887 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-central-agent" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498913 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="proxy-httpd" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.498923 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" containerName="ceilometer-notification-agent" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.503356 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.510308 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.510544 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.527279 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551058 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-scripts\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551138 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66q8x\" (UniqueName: \"kubernetes.io/projected/dd002289-4e1c-4ce5-844e-41044dfad95d-kube-api-access-66q8x\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551164 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551243 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551297 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-run-httpd\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551328 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-config-data\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.551342 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-log-httpd\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.615319 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:22 crc kubenswrapper[4933]: E1202 16:18:22.620694 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-66q8x log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="dd002289-4e1c-4ce5-844e-41044dfad95d" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657309 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-scripts\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657441 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66q8x\" (UniqueName: \"kubernetes.io/projected/dd002289-4e1c-4ce5-844e-41044dfad95d-kube-api-access-66q8x\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657474 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657640 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657738 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-run-httpd\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657792 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-config-data\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.657882 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-log-httpd\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.658290 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-run-httpd\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.658521 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-log-httpd\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.662012 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.662903 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-scripts\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.663173 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.663344 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-config-data\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:22 crc kubenswrapper[4933]: I1202 16:18:22.677063 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66q8x\" (UniqueName: \"kubernetes.io/projected/dd002289-4e1c-4ce5-844e-41044dfad95d-kube-api-access-66q8x\") pod \"ceilometer-0\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " pod="openstack/ceilometer-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.008233 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.069282 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-config-data\") pod \"d822a2fc-63d8-4767-805b-d68ff0173248\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.069370 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n45z8\" (UniqueName: \"kubernetes.io/projected/d822a2fc-63d8-4767-805b-d68ff0173248-kube-api-access-n45z8\") pod \"d822a2fc-63d8-4767-805b-d68ff0173248\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.069412 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d822a2fc-63d8-4767-805b-d68ff0173248-logs\") pod \"d822a2fc-63d8-4767-805b-d68ff0173248\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.069500 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-combined-ca-bundle\") pod \"d822a2fc-63d8-4767-805b-d68ff0173248\" (UID: \"d822a2fc-63d8-4767-805b-d68ff0173248\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.071782 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f4221c-6190-460d-9bd8-2a20b003c890" path="/var/lib/kubelet/pods/43f4221c-6190-460d-9bd8-2a20b003c890/volumes" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.074580 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d822a2fc-63d8-4767-805b-d68ff0173248-logs" (OuterVolumeSpecName: "logs") pod 
"d822a2fc-63d8-4767-805b-d68ff0173248" (UID: "d822a2fc-63d8-4767-805b-d68ff0173248"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.076554 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d822a2fc-63d8-4767-805b-d68ff0173248-kube-api-access-n45z8" (OuterVolumeSpecName: "kube-api-access-n45z8") pod "d822a2fc-63d8-4767-805b-d68ff0173248" (UID: "d822a2fc-63d8-4767-805b-d68ff0173248"). InnerVolumeSpecName "kube-api-access-n45z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.103267 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerStarted","Data":"6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b"} Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.113106 4933 generic.go:334] "Generic (PLEG): container finished" podID="d822a2fc-63d8-4767-805b-d68ff0173248" containerID="60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb" exitCode=0 Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.113184 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.114004 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.114544 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d822a2fc-63d8-4767-805b-d68ff0173248","Type":"ContainerDied","Data":"60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb"} Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.114574 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d822a2fc-63d8-4767-805b-d68ff0173248","Type":"ContainerDied","Data":"f7dd600d25baa7a5b37d7f9d7268fb021b8bd0f3b5b4b9d9457e04501dae7572"} Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.114591 4933 scope.go:117] "RemoveContainer" containerID="60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.126320 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-config-data" (OuterVolumeSpecName: "config-data") pod "d822a2fc-63d8-4767-805b-d68ff0173248" (UID: "d822a2fc-63d8-4767-805b-d68ff0173248"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.128236 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.129357 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d822a2fc-63d8-4767-805b-d68ff0173248" (UID: "d822a2fc-63d8-4767-805b-d68ff0173248"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.160569 4933 scope.go:117] "RemoveContainer" containerID="0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171649 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-scripts\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171704 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-combined-ca-bundle\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171746 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-log-httpd\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171841 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-config-data\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171904 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-sg-core-conf-yaml\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171926 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66q8x\" (UniqueName: \"kubernetes.io/projected/dd002289-4e1c-4ce5-844e-41044dfad95d-kube-api-access-66q8x\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.171956 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-run-httpd\") pod \"dd002289-4e1c-4ce5-844e-41044dfad95d\" (UID: \"dd002289-4e1c-4ce5-844e-41044dfad95d\") " Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.172569 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.172589 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n45z8\" (UniqueName: \"kubernetes.io/projected/d822a2fc-63d8-4767-805b-d68ff0173248-kube-api-access-n45z8\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.172598 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d822a2fc-63d8-4767-805b-d68ff0173248-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.172606 4933 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d822a2fc-63d8-4767-805b-d68ff0173248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.175243 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.175624 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-scripts" (OuterVolumeSpecName: "scripts") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.176121 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.179899 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd002289-4e1c-4ce5-844e-41044dfad95d-kube-api-access-66q8x" (OuterVolumeSpecName: "kube-api-access-66q8x") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "kube-api-access-66q8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.182153 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.182250 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-config-data" (OuterVolumeSpecName: "config-data") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.192596 4933 scope.go:117] "RemoveContainer" containerID="60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.195769 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd002289-4e1c-4ce5-844e-41044dfad95d" (UID: "dd002289-4e1c-4ce5-844e-41044dfad95d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:23 crc kubenswrapper[4933]: E1202 16:18:23.197958 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb\": container with ID starting with 60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb not found: ID does not exist" containerID="60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.198002 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb"} err="failed to get container status \"60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb\": rpc error: code = NotFound desc = could not find container \"60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb\": container with ID starting with 60370ca13f9bd60ea2881eae5d46b04ed2c2bbd59212dbe4137121ec23660afb not found: ID does not exist" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.198027 4933 scope.go:117] "RemoveContainer" containerID="0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6" Dec 02 16:18:23 crc kubenswrapper[4933]: E1202 16:18:23.202126 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6\": container with ID starting with 0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6 not found: ID does not exist" containerID="0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.202186 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6"} err="failed to get container status \"0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6\": rpc error: code = NotFound desc = could not find container \"0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6\": container with ID starting with 0d1d36c28b1eb7ed4343f377b2f784db47dcaf16b3355a2615520c17dc3b36f6 not found: ID does not exist" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274601 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274633 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274643 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274651 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274660 4933 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-66q8x\" (UniqueName: \"kubernetes.io/projected/dd002289-4e1c-4ce5-844e-41044dfad95d-kube-api-access-66q8x\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274668 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd002289-4e1c-4ce5-844e-41044dfad95d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.274675 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd002289-4e1c-4ce5-844e-41044dfad95d-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.514326 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.526241 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.559866 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:23 crc kubenswrapper[4933]: E1202 16:18:23.560450 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-api" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.560472 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-api" Dec 02 16:18:23 crc kubenswrapper[4933]: E1202 16:18:23.560489 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-log" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.560497 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-log" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.560762 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-api" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.560799 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" containerName="nova-api-log" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.562524 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.587568 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.587595 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.587887 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.588950 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.589000 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47162c0d-06c3-452d-b69d-a875408216f6-logs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.589031 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.589121 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-config-data\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.589160 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvtq\" (UniqueName: \"kubernetes.io/projected/47162c0d-06c3-452d-b69d-a875408216f6-kube-api-access-7dvtq\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.589223 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-public-tls-certs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.620922 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.691948 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.692005 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47162c0d-06c3-452d-b69d-a875408216f6-logs\") pod \"nova-api-0\" (UID: 
\"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.692031 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.692094 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-config-data\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.692122 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvtq\" (UniqueName: \"kubernetes.io/projected/47162c0d-06c3-452d-b69d-a875408216f6-kube-api-access-7dvtq\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.692170 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-public-tls-certs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.692577 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47162c0d-06c3-452d-b69d-a875408216f6-logs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.695358 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.695911 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.696269 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-public-tls-certs\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.698515 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-config-data\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.719840 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvtq\" (UniqueName: \"kubernetes.io/projected/47162c0d-06c3-452d-b69d-a875408216f6-kube-api-access-7dvtq\") pod \"nova-api-0\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " 
pod="openstack/nova-api-0" Dec 02 16:18:23 crc kubenswrapper[4933]: I1202 16:18:23.920154 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.140104 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.141433 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerStarted","Data":"e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b"} Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.320740 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.343918 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.361326 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.366010 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.370065 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.370432 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.375409 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.412483 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.412578 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-run-httpd\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.412702 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.412788 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-log-httpd\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.412857 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-config-data\") pod \"ceilometer-0\" (UID: 
\"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.412938 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8kk6\" (UniqueName: \"kubernetes.io/projected/7c8b8773-6b20-4130-a427-9567f6d990f4-kube-api-access-w8kk6\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.413049 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-scripts\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.442890 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.516133 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-scripts\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.516772 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.517413 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-run-httpd\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.517783 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-run-httpd\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.517994 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.518372 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-log-httpd\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.518795 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-log-httpd\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.518928 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-config-data\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.519001 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8kk6\" (UniqueName: \"kubernetes.io/projected/7c8b8773-6b20-4130-a427-9567f6d990f4-kube-api-access-w8kk6\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.524512 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-config-data\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.525153 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-scripts\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.534361 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.538718 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.538788 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8kk6\" (UniqueName: \"kubernetes.io/projected/7c8b8773-6b20-4130-a427-9567f6d990f4-kube-api-access-w8kk6\") pod \"ceilometer-0\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " pod="openstack/ceilometer-0" Dec 02 16:18:24 crc kubenswrapper[4933]: I1202 16:18:24.689513 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.082728 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d822a2fc-63d8-4767-805b-d68ff0173248" path="/var/lib/kubelet/pods/d822a2fc-63d8-4767-805b-d68ff0173248/volumes" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.084164 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd002289-4e1c-4ce5-844e-41044dfad95d" path="/var/lib/kubelet/pods/dd002289-4e1c-4ce5-844e-41044dfad95d/volumes" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.106233 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.106316 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.181965 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47162c0d-06c3-452d-b69d-a875408216f6","Type":"ContainerStarted","Data":"e6a144c1d0da23141ffa91100cf1e01ba5c7b68dbd0a8ed2ac949eb31b1a5686"} Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.182011 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47162c0d-06c3-452d-b69d-a875408216f6","Type":"ContainerStarted","Data":"81292d89bccccc8f23d5cb412aed4c0a83459fb6811457b2feac58a039c13c48"} Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.182023 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47162c0d-06c3-452d-b69d-a875408216f6","Type":"ContainerStarted","Data":"98947bf55f5adac87d2a16fe8aa8825f778094207c4e769a787959427f42cde6"} Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.220900 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.229363 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.230556 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.230535514 podStartE2EDuration="2.230535514s" podCreationTimestamp="2025-12-02 16:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:25.201492555 +0000 UTC m=+1568.452719268" watchObservedRunningTime="2025-12-02 16:18:25.230535514 +0000 UTC m=+1568.481762217" Dec 02 16:18:25 crc kubenswrapper[4933]: W1202 16:18:25.232969 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c8b8773_6b20_4130_a427_9567f6d990f4.slice/crio-114091775de82c5b15b708aee85d5ef39fc5d79fb79f4f7a7ecc88c54e0a5783 WatchSource:0}: Error finding container 114091775de82c5b15b708aee85d5ef39fc5d79fb79f4f7a7ecc88c54e0a5783: Status 404 returned error can't find the container with id 114091775de82c5b15b708aee85d5ef39fc5d79fb79f4f7a7ecc88c54e0a5783 Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.296219 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.310938 4933 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.320373 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:25 crc kubenswrapper[4933]: I1202 16:18:25.479149 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjgzk"] Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.201900 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerStarted","Data":"d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6"} Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.210310 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerStarted","Data":"114091775de82c5b15b708aee85d5ef39fc5d79fb79f4f7a7ecc88c54e0a5783"} Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.236411 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.250637 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.609817707 podStartE2EDuration="9.250606314s" podCreationTimestamp="2025-12-02 16:18:17 +0000 UTC" firstStartedPulling="2025-12-02 16:18:18.892976417 +0000 UTC m=+1562.144203120" lastFinishedPulling="2025-12-02 16:18:25.533765024 +0000 UTC m=+1568.784991727" observedRunningTime="2025-12-02 16:18:26.232178977 +0000 UTC m=+1569.483405680" watchObservedRunningTime="2025-12-02 16:18:26.250606314 +0000 UTC m=+1569.501833027" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.462611 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.513789 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-j4ntg"] Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.517354 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.526533 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.528448 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.593644 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4ntg"] Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.615102 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.615191 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw6rk\" (UniqueName: \"kubernetes.io/projected/e63b60f6-5169-44c1-b64f-6bad25c8ba22-kube-api-access-bw6rk\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.615229 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-config-data\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.615325 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-scripts\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.643965 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-kn582"] Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.644218 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7877d89589-kn582" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" containerName="dnsmasq-dns" containerID="cri-o://e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd" gracePeriod=10 Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.717595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.717681 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw6rk\" (UniqueName: \"kubernetes.io/projected/e63b60f6-5169-44c1-b64f-6bad25c8ba22-kube-api-access-bw6rk\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 
crc kubenswrapper[4933]: I1202 16:18:26.717721 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-config-data\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.717807 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-scripts\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.726163 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.726553 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-scripts\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.727861 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-config-data\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.736143 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw6rk\" (UniqueName: \"kubernetes.io/projected/e63b60f6-5169-44c1-b64f-6bad25c8ba22-kube-api-access-bw6rk\") pod \"nova-cell1-cell-mapping-j4ntg\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:26 crc kubenswrapper[4933]: I1202 16:18:26.982368 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.204166 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.242689 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2v6\" (UniqueName: \"kubernetes.io/projected/29bf2149-397b-49a3-a8b5-f2793520040e-kube-api-access-pz2v6\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.242838 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-nb\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.242920 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-config\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.242972 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.243025 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-sb\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.243121 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-svc\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.268023 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bf2149-397b-49a3-a8b5-f2793520040e-kube-api-access-pz2v6" (OuterVolumeSpecName: "kube-api-access-pz2v6") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "kube-api-access-pz2v6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.324137 4933 generic.go:334] "Generic (PLEG): container finished" podID="29bf2149-397b-49a3-a8b5-f2793520040e" containerID="e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd" exitCode=0 Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.324222 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-kn582" event={"ID":"29bf2149-397b-49a3-a8b5-f2793520040e","Type":"ContainerDied","Data":"e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd"} Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.324249 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-kn582" event={"ID":"29bf2149-397b-49a3-a8b5-f2793520040e","Type":"ContainerDied","Data":"e4a6af734486eed2bde24cd0bfd829b2bbc893d891586226616014599ba5a0d3"} Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.324264 4933 scope.go:117] "RemoveContainer" containerID="e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.324417 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-kn582" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.345599 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.347463 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2v6\" (UniqueName: \"kubernetes.io/projected/29bf2149-397b-49a3-a8b5-f2793520040e-kube-api-access-pz2v6\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.347795 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.353259 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cjgzk" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="registry-server" containerID="cri-o://a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd" gracePeriod=2 Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.354056 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerStarted","Data":"9f10a2aaf2d97f4029f4dbbdbdd93c4d6aa1b5cb107ba66005e44299b954ff69"} Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.438460 4933 scope.go:117] "RemoveContainer" containerID="4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.450475 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.450746 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0\") pod \"29bf2149-397b-49a3-a8b5-f2793520040e\" (UID: \"29bf2149-397b-49a3-a8b5-f2793520040e\") " Dec 02 16:18:27 crc kubenswrapper[4933]: W1202 16:18:27.452197 4933 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/29bf2149-397b-49a3-a8b5-f2793520040e/volumes/kubernetes.io~configmap/dns-swift-storage-0 Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.452215 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.517277 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.521326 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.553761 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.553793 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.553804 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.565567 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-config" (OuterVolumeSpecName: "config") pod "29bf2149-397b-49a3-a8b5-f2793520040e" (UID: "29bf2149-397b-49a3-a8b5-f2793520040e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.620672 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4ntg"] Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.656215 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bf2149-397b-49a3-a8b5-f2793520040e-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.796704 4933 scope.go:117] "RemoveContainer" containerID="e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd" Dec 02 16:18:27 crc kubenswrapper[4933]: E1202 16:18:27.799742 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd\": container with ID starting with e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd not found: ID does not exist" containerID="e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.799804 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd"} err="failed to get container status \"e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd\": rpc error: code = NotFound desc = could not find container \"e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd\": container with ID starting with e033e6d59b6d6809ce6e80d4fd6ca7c0af981684a01743651bce9ad0388f47cd not found: ID does not exist" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.799850 4933 scope.go:117] "RemoveContainer" containerID="4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8" Dec 02 16:18:27 crc kubenswrapper[4933]: E1202 16:18:27.803937 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8\": container with ID starting with 4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8 not found: ID does not exist" containerID="4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.803989 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8"} err="failed to get container status \"4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8\": rpc error: code = NotFound desc = could not find container \"4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8\": container with ID starting with 4039b4ffdafe2e3a13de3be3884b5fcccecfd1a33d4012ad6370cc1c2c993ee8 not found: ID does not exist" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.817551 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-kn582"] Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.843464 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-kn582"] Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.944425 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:27 crc kubenswrapper[4933]: I1202 16:18:27.952800 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.066634 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzln9\" (UniqueName: \"kubernetes.io/projected/50ec02ba-6f45-4d20-a016-53cd873906ae-kube-api-access-hzln9\") pod \"50ec02ba-6f45-4d20-a016-53cd873906ae\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.067260 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-utilities\") pod \"50ec02ba-6f45-4d20-a016-53cd873906ae\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.069421 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-utilities" (OuterVolumeSpecName: "utilities") pod "50ec02ba-6f45-4d20-a016-53cd873906ae" (UID: "50ec02ba-6f45-4d20-a016-53cd873906ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.070052 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-catalog-content\") pod \"50ec02ba-6f45-4d20-a016-53cd873906ae\" (UID: \"50ec02ba-6f45-4d20-a016-53cd873906ae\") " Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.071086 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.076282 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ec02ba-6f45-4d20-a016-53cd873906ae-kube-api-access-hzln9" (OuterVolumeSpecName: "kube-api-access-hzln9") pod "50ec02ba-6f45-4d20-a016-53cd873906ae" (UID: "50ec02ba-6f45-4d20-a016-53cd873906ae"). InnerVolumeSpecName "kube-api-access-hzln9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.117364 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50ec02ba-6f45-4d20-a016-53cd873906ae" (UID: "50ec02ba-6f45-4d20-a016-53cd873906ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.172891 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec02ba-6f45-4d20-a016-53cd873906ae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.172915 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzln9\" (UniqueName: \"kubernetes.io/projected/50ec02ba-6f45-4d20-a016-53cd873906ae-kube-api-access-hzln9\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.367057 4933 generic.go:334] "Generic (PLEG): container finished" podID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerID="a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd" exitCode=0 Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.367103 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerDied","Data":"a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd"} Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.367152 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjgzk" event={"ID":"50ec02ba-6f45-4d20-a016-53cd873906ae","Type":"ContainerDied","Data":"7c4ead521110f803e67ece6e82177718b38efc85504f0828d70f9ed5932d4b3a"} Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.367153 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjgzk" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.367176 4933 scope.go:117] "RemoveContainer" containerID="a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.375555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4ntg" event={"ID":"e63b60f6-5169-44c1-b64f-6bad25c8ba22","Type":"ContainerStarted","Data":"97790b38e1af16f3cb724c0fa81690697155a68e454d4a7c2c0fcdfdf7c7265f"} Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.375597 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4ntg" event={"ID":"e63b60f6-5169-44c1-b64f-6bad25c8ba22","Type":"ContainerStarted","Data":"e076fb81f6e0b36150a2dd0f4972f1f36678c2d416d832670838c4d2b7ad201e"} Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.378658 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerStarted","Data":"0cdfaa2c395b0884665e146c203ca33668060c1d384b091e2c7f522e486816ef"} Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.398052 4933 scope.go:117] "RemoveContainer" containerID="b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.406987 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-j4ntg" podStartSLOduration=2.406959173 podStartE2EDuration="2.406959173s" podCreationTimestamp="2025-12-02 16:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:28.392255349 +0000 UTC m=+1571.643482072" watchObservedRunningTime="2025-12-02 16:18:28.406959173 +0000 UTC 
m=+1571.658185876" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.431658 4933 scope.go:117] "RemoveContainer" containerID="09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.432995 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjgzk"] Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.443889 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cjgzk"] Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.459165 4933 scope.go:117] "RemoveContainer" containerID="a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd" Dec 02 16:18:28 crc kubenswrapper[4933]: E1202 16:18:28.459683 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd\": container with ID starting with a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd not found: ID does not exist" containerID="a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.459733 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd"} err="failed to get container status \"a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd\": rpc error: code = NotFound desc = could not find container \"a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd\": container with ID starting with a0caddeb93067580bda7c104afca17b96085fbb21420b0f6c451edc06dcc60dd not found: ID does not exist" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.459761 4933 scope.go:117] "RemoveContainer" containerID="b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4" Dec 02 16:18:28 crc kubenswrapper[4933]: E1202 16:18:28.460181 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4\": container with ID starting with b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4 not found: ID does not exist" containerID="b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.460210 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4"} err="failed to get container status \"b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4\": rpc error: code = NotFound desc = could not find container \"b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4\": container with ID starting with b5ed5f7d74538b7482cb67b0e747bfd2b3b3260d8cbcb464eeb643f22f08adb4 not found: ID does not exist" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.460230 4933 scope.go:117] "RemoveContainer" containerID="09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d" Dec 02 16:18:28 crc kubenswrapper[4933]: E1202 16:18:28.460488 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d\": container with ID starting with 09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d 
not found: ID does not exist" containerID="09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d" Dec 02 16:18:28 crc kubenswrapper[4933]: I1202 16:18:28.460507 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d"} err="failed to get container status \"09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d\": rpc error: code = NotFound desc = could not find container \"09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d\": container with ID starting with 09206952d1f4eafd9ec1315dda15d183135c85575d70432eb287465c9a0c869d not found: ID does not exist" Dec 02 16:18:29 crc kubenswrapper[4933]: I1202 16:18:29.067891 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" path="/var/lib/kubelet/pods/29bf2149-397b-49a3-a8b5-f2793520040e/volumes" Dec 02 16:18:29 crc kubenswrapper[4933]: I1202 16:18:29.069384 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" path="/var/lib/kubelet/pods/50ec02ba-6f45-4d20-a016-53cd873906ae/volumes" Dec 02 16:18:31 crc kubenswrapper[4933]: I1202 16:18:31.424507 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerStarted","Data":"263cb12cd86103682854f70122be5d9111e7bd2062dc5a3160bcd0553d308b3e"} Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.449493 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerStarted","Data":"08cb795548dfc81bc703c4ad03738280a23c1fb2e45ff4980dea95fe4bd40c20"} Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.450146 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.453310 4933 generic.go:334] "Generic (PLEG): container finished" podID="e63b60f6-5169-44c1-b64f-6bad25c8ba22" containerID="97790b38e1af16f3cb724c0fa81690697155a68e454d4a7c2c0fcdfdf7c7265f" exitCode=0 Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.453372 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4ntg" event={"ID":"e63b60f6-5169-44c1-b64f-6bad25c8ba22","Type":"ContainerDied","Data":"97790b38e1af16f3cb724c0fa81690697155a68e454d4a7c2c0fcdfdf7c7265f"} Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.496068 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.352630473 podStartE2EDuration="9.496045282s" podCreationTimestamp="2025-12-02 16:18:24 +0000 UTC" firstStartedPulling="2025-12-02 16:18:25.241205378 +0000 UTC m=+1568.492432081" lastFinishedPulling="2025-12-02 16:18:32.384620187 +0000 UTC m=+1575.635846890" observedRunningTime="2025-12-02 16:18:33.4858041 +0000 UTC m=+1576.737030833" watchObservedRunningTime="2025-12-02 16:18:33.496045282 +0000 UTC m=+1576.747271985" Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.921576 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:18:33 crc kubenswrapper[4933]: I1202 16:18:33.921963 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:18:34 crc kubenswrapper[4933]: I1202 16:18:34.941988 4933 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.254:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:34 crc kubenswrapper[4933]: I1202 16:18:34.942071 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.254:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:34 crc kubenswrapper[4933]: I1202 16:18:34.951435 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.147473 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw6rk\" (UniqueName: \"kubernetes.io/projected/e63b60f6-5169-44c1-b64f-6bad25c8ba22-kube-api-access-bw6rk\") pod \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.148844 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-config-data\") pod \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.148996 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-combined-ca-bundle\") pod \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.149142 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-scripts\") pod \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\" (UID: \"e63b60f6-5169-44c1-b64f-6bad25c8ba22\") " Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.155927 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-scripts" (OuterVolumeSpecName: "scripts") pod "e63b60f6-5169-44c1-b64f-6bad25c8ba22" (UID: "e63b60f6-5169-44c1-b64f-6bad25c8ba22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.156380 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63b60f6-5169-44c1-b64f-6bad25c8ba22-kube-api-access-bw6rk" (OuterVolumeSpecName: "kube-api-access-bw6rk") pod "e63b60f6-5169-44c1-b64f-6bad25c8ba22" (UID: "e63b60f6-5169-44c1-b64f-6bad25c8ba22"). InnerVolumeSpecName "kube-api-access-bw6rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.195397 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-config-data" (OuterVolumeSpecName: "config-data") pod "e63b60f6-5169-44c1-b64f-6bad25c8ba22" (UID: "e63b60f6-5169-44c1-b64f-6bad25c8ba22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.199503 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e63b60f6-5169-44c1-b64f-6bad25c8ba22" (UID: "e63b60f6-5169-44c1-b64f-6bad25c8ba22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.256553 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw6rk\" (UniqueName: \"kubernetes.io/projected/e63b60f6-5169-44c1-b64f-6bad25c8ba22-kube-api-access-bw6rk\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.256817 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.256943 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.257016 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63b60f6-5169-44c1-b64f-6bad25c8ba22-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.478324 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4ntg" event={"ID":"e63b60f6-5169-44c1-b64f-6bad25c8ba22","Type":"ContainerDied","Data":"e076fb81f6e0b36150a2dd0f4972f1f36678c2d416d832670838c4d2b7ad201e"} Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.478383 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e076fb81f6e0b36150a2dd0f4972f1f36678c2d416d832670838c4d2b7ad201e" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.478409 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4ntg" Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.724069 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.724522 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-log" containerID="cri-o://81292d89bccccc8f23d5cb412aed4c0a83459fb6811457b2feac58a039c13c48" gracePeriod=30 Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.724613 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-api" containerID="cri-o://e6a144c1d0da23141ffa91100cf1e01ba5c7b68dbd0a8ed2ac949eb31b1a5686" gracePeriod=30 Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.747750 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.748007 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1c58ccba-4def-4328-b1c9-6751fb2dcb0f" containerName="nova-scheduler-scheduler" containerID="cri-o://c9e66550cd7499a508258e4f85a3033dc0817b18afd31a5e11a6a13b62738d1d" gracePeriod=30 Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.765283 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.765672 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-metadata" containerID="cri-o://d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc" gracePeriod=30 Dec 02 16:18:35 crc kubenswrapper[4933]: I1202 16:18:35.765540 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-log" containerID="cri-o://d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33" gracePeriod=30 Dec 02 16:18:36 crc kubenswrapper[4933]: I1202 16:18:36.490805 4933 generic.go:334] "Generic (PLEG): container finished" podID="47162c0d-06c3-452d-b69d-a875408216f6" containerID="81292d89bccccc8f23d5cb412aed4c0a83459fb6811457b2feac58a039c13c48" exitCode=143 Dec 02 16:18:36 crc kubenswrapper[4933]: I1202 16:18:36.490863 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47162c0d-06c3-452d-b69d-a875408216f6","Type":"ContainerDied","Data":"81292d89bccccc8f23d5cb412aed4c0a83459fb6811457b2feac58a039c13c48"} Dec 02 16:18:36 crc kubenswrapper[4933]: I1202 16:18:36.493377 4933 generic.go:334] "Generic (PLEG): container finished" podID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerID="d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33" exitCode=143 Dec 02 16:18:36 crc kubenswrapper[4933]: I1202 16:18:36.493429 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327","Type":"ContainerDied","Data":"d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33"} Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.508095 4933 generic.go:334] "Generic (PLEG): container finished" podID="1c58ccba-4def-4328-b1c9-6751fb2dcb0f" 
containerID="c9e66550cd7499a508258e4f85a3033dc0817b18afd31a5e11a6a13b62738d1d" exitCode=0 Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.508192 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c58ccba-4def-4328-b1c9-6751fb2dcb0f","Type":"ContainerDied","Data":"c9e66550cd7499a508258e4f85a3033dc0817b18afd31a5e11a6a13b62738d1d"} Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.709325 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.710318 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-config-data\") pod \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.760876 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-config-data" (OuterVolumeSpecName: "config-data") pod "1c58ccba-4def-4328-b1c9-6751fb2dcb0f" (UID: "1c58ccba-4def-4328-b1c9-6751fb2dcb0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.813871 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.915906 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-combined-ca-bundle\") pod \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.916932 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59jxw\" (UniqueName: \"kubernetes.io/projected/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-kube-api-access-59jxw\") pod \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\" (UID: \"1c58ccba-4def-4328-b1c9-6751fb2dcb0f\") " Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.920135 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-kube-api-access-59jxw" (OuterVolumeSpecName: "kube-api-access-59jxw") pod "1c58ccba-4def-4328-b1c9-6751fb2dcb0f" (UID: "1c58ccba-4def-4328-b1c9-6751fb2dcb0f"). InnerVolumeSpecName "kube-api-access-59jxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:37 crc kubenswrapper[4933]: I1202 16:18:37.956607 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c58ccba-4def-4328-b1c9-6751fb2dcb0f" (UID: "1c58ccba-4def-4328-b1c9-6751fb2dcb0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.019264 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59jxw\" (UniqueName: \"kubernetes.io/projected/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-kube-api-access-59jxw\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.019294 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58ccba-4def-4328-b1c9-6751fb2dcb0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.523743 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c58ccba-4def-4328-b1c9-6751fb2dcb0f","Type":"ContainerDied","Data":"c1c1bbbda72d0f0ca5ee5e7362af506936d4c8aae1cbe6535f4b65a222614989"} Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.523795 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.524519 4933 scope.go:117] "RemoveContainer" containerID="c9e66550cd7499a508258e4f85a3033dc0817b18afd31a5e11a6a13b62738d1d" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.575681 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.592550 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.607328 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.607927 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63b60f6-5169-44c1-b64f-6bad25c8ba22" containerName="nova-manage" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.607983 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63b60f6-5169-44c1-b64f-6bad25c8ba22" containerName="nova-manage" Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.608001 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="extract-content" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608008 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="extract-content" Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.608033 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" containerName="dnsmasq-dns" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608039 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" containerName="dnsmasq-dns" Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.608050 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="registry-server" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608056 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="registry-server" Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.608076 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="extract-utilities" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 
16:18:38.608083 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="extract-utilities" Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.608103 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" containerName="init" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608110 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" containerName="init" Dec 02 16:18:38 crc kubenswrapper[4933]: E1202 16:18:38.608119 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c58ccba-4def-4328-b1c9-6751fb2dcb0f" containerName="nova-scheduler-scheduler" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608125 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58ccba-4def-4328-b1c9-6751fb2dcb0f" containerName="nova-scheduler-scheduler" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608360 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bf2149-397b-49a3-a8b5-f2793520040e" containerName="dnsmasq-dns" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608381 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63b60f6-5169-44c1-b64f-6bad25c8ba22" containerName="nova-manage" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608395 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ec02ba-6f45-4d20-a016-53cd873906ae" containerName="registry-server" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.608420 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c58ccba-4def-4328-b1c9-6751fb2dcb0f" containerName="nova-scheduler-scheduler" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.609255 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.613311 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.631944 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.647618 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5rn\" (UniqueName: \"kubernetes.io/projected/a40f5b56-cf52-45da-b3b8-70a271c07984-kube-api-access-dc5rn\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.647869 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40f5b56-cf52-45da-b3b8-70a271c07984-config-data\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.648361 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40f5b56-cf52-45da-b3b8-70a271c07984-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.749628 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5rn\" (UniqueName: \"kubernetes.io/projected/a40f5b56-cf52-45da-b3b8-70a271c07984-kube-api-access-dc5rn\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.749731 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40f5b56-cf52-45da-b3b8-70a271c07984-config-data\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.749925 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40f5b56-cf52-45da-b3b8-70a271c07984-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.756761 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40f5b56-cf52-45da-b3b8-70a271c07984-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.766423 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40f5b56-cf52-45da-b3b8-70a271c07984-config-data\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.767171 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5rn\" (UniqueName: 
\"kubernetes.io/projected/a40f5b56-cf52-45da-b3b8-70a271c07984-kube-api-access-dc5rn\") pod \"nova-scheduler-0\" (UID: \"a40f5b56-cf52-45da-b3b8-70a271c07984\") " pod="openstack/nova-scheduler-0" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.889096 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.247:8775/\": read tcp 10.217.0.2:49870->10.217.0.247:8775: read: connection reset by peer" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.889206 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.247:8775/\": read tcp 10.217.0.2:49866->10.217.0.247:8775: read: connection reset by peer" Dec 02 16:18:38 crc kubenswrapper[4933]: I1202 16:18:38.939443 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.067187 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c58ccba-4def-4328-b1c9-6751fb2dcb0f" path="/var/lib/kubelet/pods/1c58ccba-4def-4328-b1c9-6751fb2dcb0f/volumes" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.454783 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.539528 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.543115 4933 generic.go:334] "Generic (PLEG): container finished" podID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerID="d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc" exitCode=0 Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.543187 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327","Type":"ContainerDied","Data":"d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc"} Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.543218 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327","Type":"ContainerDied","Data":"14197c9753e17c31e8492c8d7ac7455c0978e4a3102ea5720f559f41ca6ca6cb"} Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.543238 4933 scope.go:117] "RemoveContainer" containerID="d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.543390 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.571070 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-combined-ca-bundle\") pod \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.571167 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-nova-metadata-tls-certs\") pod \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.571333 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42ssl\" (UniqueName: \"kubernetes.io/projected/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-kube-api-access-42ssl\") pod \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.571412 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-config-data\") pod \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.571592 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-logs\") pod \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\" (UID: \"37fc2f39-ccfe-45bc-b2e0-df5cabe8c327\") " Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.572114 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-logs" (OuterVolumeSpecName: "logs") pod "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" (UID: "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.572606 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.580179 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-kube-api-access-42ssl" (OuterVolumeSpecName: "kube-api-access-42ssl") pod "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" (UID: "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327"). InnerVolumeSpecName "kube-api-access-42ssl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.609546 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-config-data" (OuterVolumeSpecName: "config-data") pod "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" (UID: "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.630004 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" (UID: "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.669893 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" (UID: "37fc2f39-ccfe-45bc-b2e0-df5cabe8c327"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.675422 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.675458 4933 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.675496 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42ssl\" (UniqueName: \"kubernetes.io/projected/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-kube-api-access-42ssl\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.675511 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.715900 4933 scope.go:117] "RemoveContainer" containerID="d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.740365 4933 scope.go:117] "RemoveContainer" containerID="d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc" Dec 02 16:18:39 crc kubenswrapper[4933]: E1202 16:18:39.740926 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc\": container with ID starting with d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc not found: ID does not exist" containerID="d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.741007 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc"} err="failed to get container status \"d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc\": rpc error: code = NotFound desc = could not find container \"d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc\": container with ID starting with d45bfe61ec307d78b01e25e5b3d60baed2cfbd18e8e3c083dfd998cd815623dc not found: ID does not exist" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.741041 4933 
scope.go:117] "RemoveContainer" containerID="d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33" Dec 02 16:18:39 crc kubenswrapper[4933]: E1202 16:18:39.741576 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33\": container with ID starting with d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33 not found: ID does not exist" containerID="d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.741607 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33"} err="failed to get container status \"d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33\": rpc error: code = NotFound desc = could not find container \"d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33\": container with ID starting with d65d73cfd91f9de5da7119c94edf9d13504856b4a427c8ed2938ddf237e11a33 not found: ID does not exist" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.876348 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.894685 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.919037 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:18:39 crc kubenswrapper[4933]: E1202 16:18:39.919647 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-log" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.919671 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-log" Dec 02 16:18:39 crc kubenswrapper[4933]: E1202 16:18:39.919722 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-metadata" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.919732 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-metadata" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.920020 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-log" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.920052 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" containerName="nova-metadata-metadata" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.921667 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.925458 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.925570 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 16:18:39 crc kubenswrapper[4933]: I1202 16:18:39.932037 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.083620 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-logs\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.083935 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcp2\" (UniqueName: \"kubernetes.io/projected/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-kube-api-access-fpcp2\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.084091 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-config-data\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.084306 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.084380 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.186159 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcp2\" (UniqueName: \"kubernetes.io/projected/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-kube-api-access-fpcp2\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.186269 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-config-data\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.186330 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " 
pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.187041 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.187219 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-logs\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.190226 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-logs\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.192514 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-config-data\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.193063 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.203394 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.204785 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcp2\" (UniqueName: \"kubernetes.io/projected/20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d-kube-api-access-fpcp2\") pod \"nova-metadata-0\" (UID: \"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d\") " pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.242874 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.561381 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a40f5b56-cf52-45da-b3b8-70a271c07984","Type":"ContainerStarted","Data":"beeab97767d6c2243aa6e27b92aa144fe83876043626f44f12125943d4cfe3e3"} Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.561686 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a40f5b56-cf52-45da-b3b8-70a271c07984","Type":"ContainerStarted","Data":"d770adf18b3ed9501bfcb878e0de0a20bff7ee62d64df11bf42d46849a833301"} Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.566926 4933 generic.go:334] "Generic (PLEG): container finished" podID="47162c0d-06c3-452d-b69d-a875408216f6" containerID="e6a144c1d0da23141ffa91100cf1e01ba5c7b68dbd0a8ed2ac949eb31b1a5686" exitCode=0 Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.566966 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47162c0d-06c3-452d-b69d-a875408216f6","Type":"ContainerDied","Data":"e6a144c1d0da23141ffa91100cf1e01ba5c7b68dbd0a8ed2ac949eb31b1a5686"} Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.566992 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47162c0d-06c3-452d-b69d-a875408216f6","Type":"ContainerDied","Data":"98947bf55f5adac87d2a16fe8aa8825f778094207c4e769a787959427f42cde6"} Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.567004 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98947bf55f5adac87d2a16fe8aa8825f778094207c4e769a787959427f42cde6" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.580370 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.580850 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.580832676 podStartE2EDuration="2.580832676s" podCreationTimestamp="2025-12-02 16:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:40.577661759 +0000 UTC m=+1583.828888452" watchObservedRunningTime="2025-12-02 16:18:40.580832676 +0000 UTC m=+1583.832059389" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.699500 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-config-data\") pod \"47162c0d-06c3-452d-b69d-a875408216f6\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.699592 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-public-tls-certs\") pod \"47162c0d-06c3-452d-b69d-a875408216f6\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.699618 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-combined-ca-bundle\") pod \"47162c0d-06c3-452d-b69d-a875408216f6\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.699778 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47162c0d-06c3-452d-b69d-a875408216f6-logs\") pod \"47162c0d-06c3-452d-b69d-a875408216f6\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.699930 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dvtq\" (UniqueName: \"kubernetes.io/projected/47162c0d-06c3-452d-b69d-a875408216f6-kube-api-access-7dvtq\") pod \"47162c0d-06c3-452d-b69d-a875408216f6\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.700024 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-internal-tls-certs\") pod \"47162c0d-06c3-452d-b69d-a875408216f6\" (UID: \"47162c0d-06c3-452d-b69d-a875408216f6\") " Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.702134 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47162c0d-06c3-452d-b69d-a875408216f6-logs" (OuterVolumeSpecName: "logs") pod "47162c0d-06c3-452d-b69d-a875408216f6" (UID: "47162c0d-06c3-452d-b69d-a875408216f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.707540 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47162c0d-06c3-452d-b69d-a875408216f6-kube-api-access-7dvtq" (OuterVolumeSpecName: "kube-api-access-7dvtq") pod "47162c0d-06c3-452d-b69d-a875408216f6" (UID: "47162c0d-06c3-452d-b69d-a875408216f6"). InnerVolumeSpecName "kube-api-access-7dvtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.744975 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47162c0d-06c3-452d-b69d-a875408216f6" (UID: "47162c0d-06c3-452d-b69d-a875408216f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.745006 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-config-data" (OuterVolumeSpecName: "config-data") pod "47162c0d-06c3-452d-b69d-a875408216f6" (UID: "47162c0d-06c3-452d-b69d-a875408216f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.775702 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47162c0d-06c3-452d-b69d-a875408216f6" (UID: "47162c0d-06c3-452d-b69d-a875408216f6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:40 crc kubenswrapper[4933]: W1202 16:18:40.779285 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20794a38_c3e1_4b9e_a8d6_6df6d2a6bd7d.slice/crio-1e4d6577431397d44bebc56ba5264a9ddb3801cb9d6de5781f8f422bfb55f5c1 WatchSource:0}: Error finding container 1e4d6577431397d44bebc56ba5264a9ddb3801cb9d6de5781f8f422bfb55f5c1: Status 404 returned error can't find the container with id 1e4d6577431397d44bebc56ba5264a9ddb3801cb9d6de5781f8f422bfb55f5c1 Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.783784 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.797930 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47162c0d-06c3-452d-b69d-a875408216f6" (UID: "47162c0d-06c3-452d-b69d-a875408216f6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.803009 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.803059 4933 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.803074 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.803087 4933 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47162c0d-06c3-452d-b69d-a875408216f6-logs\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.803099 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dvtq\" (UniqueName: \"kubernetes.io/projected/47162c0d-06c3-452d-b69d-a875408216f6-kube-api-access-7dvtq\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:40 crc kubenswrapper[4933]: I1202 16:18:40.803111 4933 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47162c0d-06c3-452d-b69d-a875408216f6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.067295 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37fc2f39-ccfe-45bc-b2e0-df5cabe8c327" path="/var/lib/kubelet/pods/37fc2f39-ccfe-45bc-b2e0-df5cabe8c327/volumes" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.579950 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.579946 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d","Type":"ContainerStarted","Data":"dd61ea3ebe72a4f7b7b176f2d73c55ddde2ee5c457a24edcc347da5513863d17"} Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.580419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d","Type":"ContainerStarted","Data":"829cfd97597d67e8fceef8c241834c3e8df914cf1c5bb0228c9238380350da91"} Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.580451 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d","Type":"ContainerStarted","Data":"1e4d6577431397d44bebc56ba5264a9ddb3801cb9d6de5781f8f422bfb55f5c1"} Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.614093 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.614069257 podStartE2EDuration="2.614069257s" podCreationTimestamp="2025-12-02 16:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:41.606256022 +0000 UTC m=+1584.857482755" watchObservedRunningTime="2025-12-02 16:18:41.614069257 +0000 UTC m=+1584.865295970" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.644951 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.662556 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.684717 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:41 crc kubenswrapper[4933]: E1202 16:18:41.685392 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-api" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.685418 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-api" Dec 02 16:18:41 crc kubenswrapper[4933]: E1202 16:18:41.685463 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-log" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.685470 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-log" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.685777 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-log" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.685815 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="47162c0d-06c3-452d-b69d-a875408216f6" containerName="nova-api-api" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.687505 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.690509 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.690591 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.690814 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.719575 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.830845 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.830985 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.831034 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aa955fa-0221-4590-bd34-0ee84ba06562-logs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.831101 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csf2\" (UniqueName: \"kubernetes.io/projected/9aa955fa-0221-4590-bd34-0ee84ba06562-kube-api-access-7csf2\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.831320 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-config-data\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.831392 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-public-tls-certs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.933517 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-public-tls-certs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.933622 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.933692 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.933722 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aa955fa-0221-4590-bd34-0ee84ba06562-logs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.933764 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7csf2\" (UniqueName: \"kubernetes.io/projected/9aa955fa-0221-4590-bd34-0ee84ba06562-kube-api-access-7csf2\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.933938 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-config-data\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.934607 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aa955fa-0221-4590-bd34-0ee84ba06562-logs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.939455 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-public-tls-certs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.940025 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-config-data\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.940387 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.963487 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa955fa-0221-4590-bd34-0ee84ba06562-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " pod="openstack/nova-api-0" Dec 02 16:18:41 crc kubenswrapper[4933]: I1202 16:18:41.965669 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csf2\" (UniqueName: \"kubernetes.io/projected/9aa955fa-0221-4590-bd34-0ee84ba06562-kube-api-access-7csf2\") pod \"nova-api-0\" (UID: \"9aa955fa-0221-4590-bd34-0ee84ba06562\") " 
pod="openstack/nova-api-0" Dec 02 16:18:42 crc kubenswrapper[4933]: I1202 16:18:42.008003 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 16:18:42 crc kubenswrapper[4933]: I1202 16:18:42.534101 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 16:18:42 crc kubenswrapper[4933]: W1202 16:18:42.545142 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa955fa_0221_4590_bd34_0ee84ba06562.slice/crio-a1ff951989306c47ea45140e0b7ccf6ca68f7ed019fb1bae8ee914e8c631cfec WatchSource:0}: Error finding container a1ff951989306c47ea45140e0b7ccf6ca68f7ed019fb1bae8ee914e8c631cfec: Status 404 returned error can't find the container with id a1ff951989306c47ea45140e0b7ccf6ca68f7ed019fb1bae8ee914e8c631cfec Dec 02 16:18:42 crc kubenswrapper[4933]: I1202 16:18:42.596476 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9aa955fa-0221-4590-bd34-0ee84ba06562","Type":"ContainerStarted","Data":"a1ff951989306c47ea45140e0b7ccf6ca68f7ed019fb1bae8ee914e8c631cfec"} Dec 02 16:18:43 crc kubenswrapper[4933]: I1202 16:18:43.068921 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47162c0d-06c3-452d-b69d-a875408216f6" path="/var/lib/kubelet/pods/47162c0d-06c3-452d-b69d-a875408216f6/volumes" Dec 02 16:18:43 crc kubenswrapper[4933]: I1202 16:18:43.606669 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9aa955fa-0221-4590-bd34-0ee84ba06562","Type":"ContainerStarted","Data":"e541fe902068656f8843ec86191dd7b2103cf48ca66448634ad8d90deb8b7206"} Dec 02 16:18:43 crc kubenswrapper[4933]: I1202 16:18:43.606723 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9aa955fa-0221-4590-bd34-0ee84ba06562","Type":"ContainerStarted","Data":"5093d82dd04f7592e40804d6526127844ceda4d619d12392265c480ca29bab28"} Dec 02 16:18:43 crc kubenswrapper[4933]: I1202 16:18:43.628960 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6289386009999998 podStartE2EDuration="2.628938601s" podCreationTimestamp="2025-12-02 16:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:18:43.628495479 +0000 UTC m=+1586.879722212" watchObservedRunningTime="2025-12-02 16:18:43.628938601 +0000 UTC m=+1586.880165304" Dec 02 16:18:43 crc kubenswrapper[4933]: I1202 16:18:43.940516 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 16:18:45 crc kubenswrapper[4933]: I1202 16:18:45.243290 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 16:18:45 crc kubenswrapper[4933]: I1202 16:18:45.243688 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.169180 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.169497 4933 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.169536 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.170362 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.170437 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" gracePeriod=600 Dec 02 16:18:47 crc kubenswrapper[4933]: E1202 16:18:47.278212 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c1c5e6_50dd_428a_890c_2c3f0456f2fa.slice/crio-conmon-3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:18:47 crc kubenswrapper[4933]: E1202 16:18:47.288698 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.656036 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" exitCode=0 Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.656109 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"} Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.656415 4933 scope.go:117] "RemoveContainer" containerID="88738e3bef53d0478fb4beaed256d677959bb94ab5e545598a6a14e9cb4a5112" Dec 02 16:18:47 crc kubenswrapper[4933]: I1202 16:18:47.657907 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:18:47 crc kubenswrapper[4933]: E1202 16:18:47.658631 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:18:48 crc kubenswrapper[4933]: I1202 16:18:48.940548 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 16:18:48 crc kubenswrapper[4933]: I1202 16:18:48.985972 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 16:18:49 crc kubenswrapper[4933]: I1202 16:18:49.735624 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 16:18:50 crc kubenswrapper[4933]: I1202 16:18:50.243842 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 16:18:50 crc kubenswrapper[4933]: I1202 16:18:50.244177 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 16:18:51 crc kubenswrapper[4933]: I1202 16:18:51.255015 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.2:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:51 crc kubenswrapper[4933]: I1202 16:18:51.255044 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.2:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:52 crc kubenswrapper[4933]: I1202 16:18:52.008905 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:18:52 crc kubenswrapper[4933]: I1202 16:18:52.009249 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 16:18:53 crc kubenswrapper[4933]: I1202 16:18:53.028126 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9aa955fa-0221-4590-bd34-0ee84ba06562" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.3:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:53 crc kubenswrapper[4933]: I1202 16:18:53.028199 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9aa955fa-0221-4590-bd34-0ee84ba06562" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.3:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 16:18:54 crc kubenswrapper[4933]: I1202 16:18:54.694593 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.614406 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.614955 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" containerName="kube-state-metrics" containerID="cri-o://1f19e9713fcdd5bfdc66ce1322b3ccb20258bc3fc52a9850060d466d321dc854" gracePeriod=30 Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.668595 4933 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.668868 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="fa0c36d2-33b1-4415-9413-7835332aa49d" containerName="mysqld-exporter" containerID="cri-o://2d89454283151b2d688ebfe4eec249b0c87cde5ca22c4f55a067a14c997607d2" gracePeriod=30 Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.827580 4933 generic.go:334] "Generic (PLEG): container finished" podID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" containerID="1f19e9713fcdd5bfdc66ce1322b3ccb20258bc3fc52a9850060d466d321dc854" exitCode=2 Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.827668 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"09f9fb84-64a0-46a8-8551-4fcbf0e46050","Type":"ContainerDied","Data":"1f19e9713fcdd5bfdc66ce1322b3ccb20258bc3fc52a9850060d466d321dc854"} Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.830297 4933 generic.go:334] "Generic (PLEG): container finished" podID="fa0c36d2-33b1-4415-9413-7835332aa49d" containerID="2d89454283151b2d688ebfe4eec249b0c87cde5ca22c4f55a067a14c997607d2" exitCode=2 Dec 02 16:18:58 crc kubenswrapper[4933]: I1202 16:18:58.830334 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"fa0c36d2-33b1-4415-9413-7835332aa49d","Type":"ContainerDied","Data":"2d89454283151b2d688ebfe4eec249b0c87cde5ca22c4f55a067a14c997607d2"} Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.054590 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:18:59 crc kubenswrapper[4933]: E1202 16:18:59.055128 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.372257 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.378489 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.454876 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbftz\" (UniqueName: \"kubernetes.io/projected/fa0c36d2-33b1-4415-9413-7835332aa49d-kube-api-access-gbftz\") pod \"fa0c36d2-33b1-4415-9413-7835332aa49d\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.455156 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data\") pod \"fa0c36d2-33b1-4415-9413-7835332aa49d\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.455326 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js6p2\" (UniqueName: \"kubernetes.io/projected/09f9fb84-64a0-46a8-8551-4fcbf0e46050-kube-api-access-js6p2\") pod \"09f9fb84-64a0-46a8-8551-4fcbf0e46050\" (UID: \"09f9fb84-64a0-46a8-8551-4fcbf0e46050\") " Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.455366 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-combined-ca-bundle\") pod \"fa0c36d2-33b1-4415-9413-7835332aa49d\" (UID: \"fa0c36d2-33b1-4415-9413-7835332aa49d\") " Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.465872 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f9fb84-64a0-46a8-8551-4fcbf0e46050-kube-api-access-js6p2" (OuterVolumeSpecName: "kube-api-access-js6p2") pod "09f9fb84-64a0-46a8-8551-4fcbf0e46050" (UID: "09f9fb84-64a0-46a8-8551-4fcbf0e46050"). InnerVolumeSpecName "kube-api-access-js6p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.471529 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0c36d2-33b1-4415-9413-7835332aa49d-kube-api-access-gbftz" (OuterVolumeSpecName: "kube-api-access-gbftz") pod "fa0c36d2-33b1-4415-9413-7835332aa49d" (UID: "fa0c36d2-33b1-4415-9413-7835332aa49d"). InnerVolumeSpecName "kube-api-access-gbftz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.502917 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa0c36d2-33b1-4415-9413-7835332aa49d" (UID: "fa0c36d2-33b1-4415-9413-7835332aa49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.548858 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data" (OuterVolumeSpecName: "config-data") pod "fa0c36d2-33b1-4415-9413-7835332aa49d" (UID: "fa0c36d2-33b1-4415-9413-7835332aa49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.558361 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.558395 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js6p2\" (UniqueName: \"kubernetes.io/projected/09f9fb84-64a0-46a8-8551-4fcbf0e46050-kube-api-access-js6p2\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.558409 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c36d2-33b1-4415-9413-7835332aa49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.558420 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbftz\" (UniqueName: \"kubernetes.io/projected/fa0c36d2-33b1-4415-9413-7835332aa49d-kube-api-access-gbftz\") on node \"crc\" DevicePath \"\"" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.842299 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"09f9fb84-64a0-46a8-8551-4fcbf0e46050","Type":"ContainerDied","Data":"a352a65a9c0d7ca501c24658a49c21d73ce43bd39a57d3da215596102005ca27"} Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.842358 4933 scope.go:117] "RemoveContainer" containerID="1f19e9713fcdd5bfdc66ce1322b3ccb20258bc3fc52a9850060d466d321dc854" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.842438 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.843998 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"fa0c36d2-33b1-4415-9413-7835332aa49d","Type":"ContainerDied","Data":"a44b461c55935785ce7e2b10367e551519a02f9b3e481e0fbd8d60797fe7238d"} Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.844154 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.882423 4933 scope.go:117] "RemoveContainer" containerID="2d89454283151b2d688ebfe4eec249b0c87cde5ca22c4f55a067a14c997607d2" Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.924964 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.942620 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.958026 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.970049 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 16:18:59 crc kubenswrapper[4933]: I1202 16:18:59.997987 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:19:00 crc kubenswrapper[4933]: E1202 16:18:59.999772 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" containerName="kube-state-metrics" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:18:59.999803 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" containerName="kube-state-metrics" Dec 02 16:19:00 crc kubenswrapper[4933]: E1202 16:18:59.999868 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0c36d2-33b1-4415-9413-7835332aa49d" containerName="mysqld-exporter" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:18:59.999878 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0c36d2-33b1-4415-9413-7835332aa49d" containerName="mysqld-exporter" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.008374 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0c36d2-33b1-4415-9413-7835332aa49d" containerName="mysqld-exporter" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.008446 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" containerName="kube-state-metrics" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.010042 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.014329 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.014689 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.022022 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.042060 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.044248 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.046066 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.046227 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.059322 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.074943 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.075131 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.075170 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bld75\" (UniqueName: \"kubernetes.io/projected/43672f01-bd87-4595-8ba0-d76811762bc2-kube-api-access-bld75\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.075388 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-config-data\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.176915 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-config-data\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.176992 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvccj\" (UniqueName: \"kubernetes.io/projected/06766e68-6811-4cd2-bb90-67cc353669e6-kube-api-access-lvccj\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.177031 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.177063 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.177243 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.177465 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.177527 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bld75\" (UniqueName: \"kubernetes.io/projected/43672f01-bd87-4595-8ba0-d76811762bc2-kube-api-access-bld75\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.177614 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.181166 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.182045 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-config-data\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.182596 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43672f01-bd87-4595-8ba0-d76811762bc2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.198856 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bld75\" (UniqueName: \"kubernetes.io/projected/43672f01-bd87-4595-8ba0-d76811762bc2-kube-api-access-bld75\") pod \"mysqld-exporter-0\" (UID: \"43672f01-bd87-4595-8ba0-d76811762bc2\") " pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.265291 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 16:19:00 crc 
kubenswrapper[4933]: I1202 16:19:00.265348 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.272164 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.273116 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.279847 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.280054 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvccj\" (UniqueName: \"kubernetes.io/projected/06766e68-6811-4cd2-bb90-67cc353669e6-kube-api-access-lvccj\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.280105 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.280178 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.283802 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.285988 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.287895 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06766e68-6811-4cd2-bb90-67cc353669e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.318863 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvccj\" (UniqueName: \"kubernetes.io/projected/06766e68-6811-4cd2-bb90-67cc353669e6-kube-api-access-lvccj\") pod \"kube-state-metrics-0\" (UID: \"06766e68-6811-4cd2-bb90-67cc353669e6\") " 
pod="openstack/kube-state-metrics-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.338618 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 16:19:00 crc kubenswrapper[4933]: I1202 16:19:00.373953 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.067770 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f9fb84-64a0-46a8-8551-4fcbf0e46050" path="/var/lib/kubelet/pods/09f9fb84-64a0-46a8-8551-4fcbf0e46050/volumes" Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.068814 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0c36d2-33b1-4415-9413-7835332aa49d" path="/var/lib/kubelet/pods/fa0c36d2-33b1-4415-9413-7835332aa49d/volumes" Dec 02 16:19:01 crc kubenswrapper[4933]: W1202 16:19:01.121367 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43672f01_bd87_4595_8ba0_d76811762bc2.slice/crio-233143f613c2a34c8b578327f248df0c7a260e880e1b9feaa7c4a2c09aba28b0 WatchSource:0}: Error finding container 233143f613c2a34c8b578327f248df0c7a260e880e1b9feaa7c4a2c09aba28b0: Status 404 returned error can't find the container with id 233143f613c2a34c8b578327f248df0c7a260e880e1b9feaa7c4a2c09aba28b0 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.129797 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.219989 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.220911 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="sg-core" containerID="cri-o://263cb12cd86103682854f70122be5d9111e7bd2062dc5a3160bcd0553d308b3e" gracePeriod=30 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.220937 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="proxy-httpd" containerID="cri-o://08cb795548dfc81bc703c4ad03738280a23c1fb2e45ff4980dea95fe4bd40c20" gracePeriod=30 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.220937 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-notification-agent" containerID="cri-o://0cdfaa2c395b0884665e146c203ca33668060c1d384b091e2c7f522e486816ef" gracePeriod=30 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.221142 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-central-agent" containerID="cri-o://9f10a2aaf2d97f4029f4dbbdbdd93c4d6aa1b5cb107ba66005e44299b954ff69" gracePeriod=30 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.256036 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.883953 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"43672f01-bd87-4595-8ba0-d76811762bc2","Type":"ContainerStarted","Data":"233143f613c2a34c8b578327f248df0c7a260e880e1b9feaa7c4a2c09aba28b0"} Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.887713 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06766e68-6811-4cd2-bb90-67cc353669e6","Type":"ContainerStarted","Data":"dddf436af86a81929b323d13896ab8ca93429ad7830d745fea9b95796e50654c"} Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.887749 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06766e68-6811-4cd2-bb90-67cc353669e6","Type":"ContainerStarted","Data":"68c0efcc4e648f4bbe76e46acf8cddacaace43a24a860d30526f59cb58771d62"} Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.887903 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.893252 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerDied","Data":"08cb795548dfc81bc703c4ad03738280a23c1fb2e45ff4980dea95fe4bd40c20"} Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.893256 4933 generic.go:334] "Generic (PLEG): container finished" podID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerID="08cb795548dfc81bc703c4ad03738280a23c1fb2e45ff4980dea95fe4bd40c20" exitCode=0 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.893316 4933 generic.go:334] "Generic (PLEG): container finished" podID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerID="263cb12cd86103682854f70122be5d9111e7bd2062dc5a3160bcd0553d308b3e" exitCode=2 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.893327 4933 generic.go:334] "Generic (PLEG): container finished" podID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerID="9f10a2aaf2d97f4029f4dbbdbdd93c4d6aa1b5cb107ba66005e44299b954ff69" exitCode=0 Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.893330 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerDied","Data":"263cb12cd86103682854f70122be5d9111e7bd2062dc5a3160bcd0553d308b3e"} Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.893347 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerDied","Data":"9f10a2aaf2d97f4029f4dbbdbdd93c4d6aa1b5cb107ba66005e44299b954ff69"} Dec 02 16:19:01 crc kubenswrapper[4933]: I1202 16:19:01.912746 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.556081931 podStartE2EDuration="2.912722882s" podCreationTimestamp="2025-12-02 16:18:59 +0000 UTC" firstStartedPulling="2025-12-02 16:19:01.254426624 +0000 UTC m=+1604.505653327" lastFinishedPulling="2025-12-02 16:19:01.611067575 +0000 UTC m=+1604.862294278" observedRunningTime="2025-12-02 16:19:01.908122335 +0000 UTC m=+1605.159349058" watchObservedRunningTime="2025-12-02 16:19:01.912722882 +0000 UTC m=+1605.163949585" Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.015205 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.015282 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 16:19:02 
Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.016905 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.016942 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.027972 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.029391 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.912586 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"43672f01-bd87-4595-8ba0-d76811762bc2","Type":"ContainerStarted","Data":"3a6d7f7bdd4efb6361e332bd6dc18b730554ee416cff2cbd8b1dd135bf1f51c6"}
Dec 02 16:19:02 crc kubenswrapper[4933]: I1202 16:19:02.958120 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.304542751 podStartE2EDuration="3.958097878s" podCreationTimestamp="2025-12-02 16:18:59 +0000 UTC" firstStartedPulling="2025-12-02 16:19:01.123793287 +0000 UTC m=+1604.375019990" lastFinishedPulling="2025-12-02 16:19:01.777348364 +0000 UTC m=+1605.028575117" observedRunningTime="2025-12-02 16:19:02.935414653 +0000 UTC m=+1606.186641366" watchObservedRunningTime="2025-12-02 16:19:02.958097878 +0000 UTC m=+1606.209324571"
Dec 02 16:19:04 crc kubenswrapper[4933]: I1202 16:19:04.937810 4933 generic.go:334] "Generic (PLEG): container finished" podID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerID="0cdfaa2c395b0884665e146c203ca33668060c1d384b091e2c7f522e486816ef" exitCode=0
Dec 02 16:19:04 crc kubenswrapper[4933]: I1202 16:19:04.938122 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerDied","Data":"0cdfaa2c395b0884665e146c203ca33668060c1d384b091e2c7f522e486816ef"}
Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.055692 4933 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.138999 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-run-httpd\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139152 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-log-httpd\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139251 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-sg-core-conf-yaml\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139278 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-scripts\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139307 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-combined-ca-bundle\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139400 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-config-data\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139487 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8kk6\" (UniqueName: \"kubernetes.io/projected/7c8b8773-6b20-4130-a427-9567f6d990f4-kube-api-access-w8kk6\") pod \"7c8b8773-6b20-4130-a427-9567f6d990f4\" (UID: \"7c8b8773-6b20-4130-a427-9567f6d990f4\") " Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139785 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.139854 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.140383 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.140402 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c8b8773-6b20-4130-a427-9567f6d990f4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.146446 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-scripts" (OuterVolumeSpecName: "scripts") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.150853 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8b8773-6b20-4130-a427-9567f6d990f4-kube-api-access-w8kk6" (OuterVolumeSpecName: "kube-api-access-w8kk6") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "kube-api-access-w8kk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.203099 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.243668 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.254578 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.254599 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8kk6\" (UniqueName: \"kubernetes.io/projected/7c8b8773-6b20-4130-a427-9567f6d990f4-kube-api-access-w8kk6\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.260165 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.314517 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-config-data" (OuterVolumeSpecName: "config-data") pod "7c8b8773-6b20-4130-a427-9567f6d990f4" (UID: "7c8b8773-6b20-4130-a427-9567f6d990f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.358248 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.358284 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8b8773-6b20-4130-a427-9567f6d990f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.953011 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c8b8773-6b20-4130-a427-9567f6d990f4","Type":"ContainerDied","Data":"114091775de82c5b15b708aee85d5ef39fc5d79fb79f4f7a7ecc88c54e0a5783"} Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.953080 4933 scope.go:117] "RemoveContainer" containerID="08cb795548dfc81bc703c4ad03738280a23c1fb2e45ff4980dea95fe4bd40c20" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.953108 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:05 crc kubenswrapper[4933]: I1202 16:19:05.995073 4933 scope.go:117] "RemoveContainer" containerID="263cb12cd86103682854f70122be5d9111e7bd2062dc5a3160bcd0553d308b3e" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.012716 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.029041 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.032185 4933 scope.go:117] "RemoveContainer" containerID="0cdfaa2c395b0884665e146c203ca33668060c1d384b091e2c7f522e486816ef" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.076363 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:06 crc kubenswrapper[4933]: E1202 16:19:06.076922 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="proxy-httpd" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.076958 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="proxy-httpd" Dec 02 16:19:06 crc kubenswrapper[4933]: E1202 16:19:06.077014 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-notification-agent" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077022 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-notification-agent" Dec 02 16:19:06 crc kubenswrapper[4933]: E1202 16:19:06.077043 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="sg-core" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077049 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="sg-core" Dec 02 16:19:06 crc kubenswrapper[4933]: E1202 16:19:06.077076 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-central-agent" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077083 4933 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-central-agent" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077336 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-notification-agent" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077353 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="ceilometer-central-agent" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077372 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="sg-core" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.077384 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" containerName="proxy-httpd" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.079646 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.081519 4933 scope.go:117] "RemoveContainer" containerID="9f10a2aaf2d97f4029f4dbbdbdd93c4d6aa1b5cb107ba66005e44299b954ff69" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.082171 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.082416 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.082492 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.095225 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.176996 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-run-httpd\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177068 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177094 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177143 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskjg\" (UniqueName: \"kubernetes.io/projected/873b2114-d05d-4077-9441-60b82e242236-kube-api-access-sskjg\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177186 4933 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-scripts\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177361 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177427 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-log-httpd\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.177468 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-config-data\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280306 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280405 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-log-httpd\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280432 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-config-data\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280543 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-run-httpd\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280599 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280623 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0" Dec 02 16:19:06 crc kubenswrapper[4933]: 
I1202 16:19:06.280682 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sskjg\" (UniqueName: \"kubernetes.io/projected/873b2114-d05d-4077-9441-60b82e242236-kube-api-access-sskjg\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.280737 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-scripts\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.281927 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-log-httpd\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.282092 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-run-httpd\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.285813 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.286475 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.290061 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-scripts\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.290773 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-config-data\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.296118 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.302262 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskjg\" (UniqueName: \"kubernetes.io/projected/873b2114-d05d-4077-9441-60b82e242236-kube-api-access-sskjg\") pod \"ceilometer-0\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " pod="openstack/ceilometer-0"
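Each of ceilometer-0's volumes above walks the same three-step sequence: VerifyControllerAttachedVolume (the volume is in the desired state for the pod), MountVolume started, then MountVolume.SetUp succeeded once the secret, projected token, or emptyDir is materialized under the pod's volumes directory. A compressed sketch of that desired-state/actual-state reconcile step, with hypothetical types standing in for kubelet's internal caches:

    package main

    import "fmt"

    // volume is a stand-in for kubelet's volume spec; the real reconciler
    // diffs a desired-state-of-world cache against an actual-state-of-world
    // cache and issues mount operations for whatever is missing.
    type volume struct{ name, plugin string }

    func reconcile(desired []volume, mounted map[string]bool) {
        for _, v := range desired {
            if mounted[v.name] {
                continue // already in the actual state; nothing to do
            }
            fmt.Printf("VerifyControllerAttachedVolume started for %q (%s)\n", v.name, v.plugin)
            fmt.Printf("MountVolume started for %q\n", v.name)
            // ... the plugin's SetUp() would run here ...
            mounted[v.name] = true
            fmt.Printf("MountVolume.SetUp succeeded for %q\n", v.name)
        }
    }

    func main() {
        desired := []volume{
            {"run-httpd", "kubernetes.io/empty-dir"},
            {"ceilometer-tls-certs", "kubernetes.io/secret"},
            {"kube-api-access-sskjg", "kubernetes.io/projected"},
        }
        reconcile(desired, map[string]bool{})
    }

The ordering in the log follows from this loop: all volumes get their started entries in one reconciler pass, and the succeeded entries trail in whatever order SetUp completes.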
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.407908 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 16:19:06 crc kubenswrapper[4933]: I1202 16:19:06.977358 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 16:19:07 crc kubenswrapper[4933]: I1202 16:19:07.069319 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8b8773-6b20-4130-a427-9567f6d990f4" path="/var/lib/kubelet/pods/7c8b8773-6b20-4130-a427-9567f6d990f4/volumes"
Dec 02 16:19:07 crc kubenswrapper[4933]: I1202 16:19:07.979997 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerStarted","Data":"934dad6f8e158e95eb323703ce7615af58818b0adf59144632bbdaa1978d6166"}
Dec 02 16:19:07 crc kubenswrapper[4933]: I1202 16:19:07.980591 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerStarted","Data":"a1df9b3c3338c6b5c9ec6f29f39f1f3935e4fdd2af7af6b486dbee588a839df2"}
Dec 02 16:19:08 crc kubenswrapper[4933]: I1202 16:19:08.996066 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerStarted","Data":"d4050e9a624bf57313fa9cfee10dee5bfba7aa9a44fdd1818ade1487c18db41a"}
Dec 02 16:19:10 crc kubenswrapper[4933]: I1202 16:19:10.480458 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 02 16:19:11 crc kubenswrapper[4933]: I1202 16:19:11.021343 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerStarted","Data":"5ae83b32186382be30f0bd99aad1fee84cebea5aa0795865d41558d65237b10a"}
Dec 02 16:19:11 crc kubenswrapper[4933]: I1202 16:19:11.053731 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"
Dec 02 16:19:11 crc kubenswrapper[4933]: E1202 16:19:11.054293 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:19:13 crc kubenswrapper[4933]: I1202 16:19:13.049150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerStarted","Data":"8e60c5fa73b05e9e2135070d52fefa9fb46633272ae3c6acfde3525a45322b33"}
Dec 02 16:19:13 crc kubenswrapper[4933]: I1202 16:19:13.049743 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
m=+1616.334190380" Dec 02 16:19:26 crc kubenswrapper[4933]: I1202 16:19:26.054336 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:19:26 crc kubenswrapper[4933]: E1202 16:19:26.055472 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:19:36 crc kubenswrapper[4933]: I1202 16:19:36.419042 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 16:19:38 crc kubenswrapper[4933]: I1202 16:19:38.054120 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:19:38 crc kubenswrapper[4933]: E1202 16:19:38.054972 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:19:42 crc kubenswrapper[4933]: I1202 16:19:42.773423 4933 scope.go:117] "RemoveContainer" containerID="9d30cec3a97a831843594c14863a1367de3b3e59a815b953ed80bbf97f8d7a3f" Dec 02 16:19:42 crc kubenswrapper[4933]: I1202 16:19:42.811018 4933 scope.go:117] "RemoveContainer" containerID="432b0ec0aa421862c4a079761af395b452a813f5b9ceb3d0c7b111354bdf1e7a" Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.682531 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-kzxbf"] Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.692765 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-kzxbf"] Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.786215 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-747pm"] Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.787927 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-747pm" Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.813617 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-747pm"] Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.906421 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-combined-ca-bundle\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.906645 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:47 crc kubenswrapper[4933]: I1202 16:19:47.906715 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfj5\" (UniqueName: \"kubernetes.io/projected/70ee251f-65d2-476a-ab6d-3fc838cbdf55-kube-api-access-2xfj5\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.008814 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.008919 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfj5\" (UniqueName: \"kubernetes.io/projected/70ee251f-65d2-476a-ab6d-3fc838cbdf55-kube-api-access-2xfj5\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.009028 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-combined-ca-bundle\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.017139 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.021972 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-combined-ca-bundle\") pod \"heat-db-sync-747pm\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.033338 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfj5\" (UniqueName: \"kubernetes.io/projected/70ee251f-65d2-476a-ab6d-3fc838cbdf55-kube-api-access-2xfj5\") pod \"heat-db-sync-747pm\" (UID: 
\"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") " pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.120580 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-747pm" Dec 02 16:19:48 crc kubenswrapper[4933]: I1202 16:19:48.669006 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-747pm"] Dec 02 16:19:48 crc kubenswrapper[4933]: W1202 16:19:48.674345 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70ee251f_65d2_476a_ab6d_3fc838cbdf55.slice/crio-bcea26e68bf71bfda153742f8cc0c3dd020a522454987f76c4f4b40e84e45799 WatchSource:0}: Error finding container bcea26e68bf71bfda153742f8cc0c3dd020a522454987f76c4f4b40e84e45799: Status 404 returned error can't find the container with id bcea26e68bf71bfda153742f8cc0c3dd020a522454987f76c4f4b40e84e45799 Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.067186 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4239535-0d5c-4b17-a695-9f57efb4d381" path="/var/lib/kubelet/pods/c4239535-0d5c-4b17-a695-9f57efb4d381/volumes" Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.529183 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-747pm" event={"ID":"70ee251f-65d2-476a-ab6d-3fc838cbdf55","Type":"ContainerStarted","Data":"bcea26e68bf71bfda153742f8cc0c3dd020a522454987f76c4f4b40e84e45799"} Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.922816 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.923100 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-central-agent" containerID="cri-o://934dad6f8e158e95eb323703ce7615af58818b0adf59144632bbdaa1978d6166" gracePeriod=30 Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.923603 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="proxy-httpd" containerID="cri-o://8e60c5fa73b05e9e2135070d52fefa9fb46633272ae3c6acfde3525a45322b33" gracePeriod=30 Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.923650 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="sg-core" containerID="cri-o://5ae83b32186382be30f0bd99aad1fee84cebea5aa0795865d41558d65237b10a" gracePeriod=30 Dec 02 16:19:49 crc kubenswrapper[4933]: I1202 16:19:49.923680 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-notification-agent" containerID="cri-o://d4050e9a624bf57313fa9cfee10dee5bfba7aa9a44fdd1818ade1487c18db41a" gracePeriod=30 Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.563923 4933 generic.go:334] "Generic (PLEG): container finished" podID="873b2114-d05d-4077-9441-60b82e242236" containerID="8e60c5fa73b05e9e2135070d52fefa9fb46633272ae3c6acfde3525a45322b33" exitCode=0 Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.564183 4933 generic.go:334] "Generic (PLEG): container finished" podID="873b2114-d05d-4077-9441-60b82e242236" containerID="5ae83b32186382be30f0bd99aad1fee84cebea5aa0795865d41558d65237b10a" 
exitCode=2
Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.564193 4933 generic.go:334] "Generic (PLEG): container finished" podID="873b2114-d05d-4077-9441-60b82e242236" containerID="934dad6f8e158e95eb323703ce7615af58818b0adf59144632bbdaa1978d6166" exitCode=0
Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.564114 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerDied","Data":"8e60c5fa73b05e9e2135070d52fefa9fb46633272ae3c6acfde3525a45322b33"}
Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.564229 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerDied","Data":"5ae83b32186382be30f0bd99aad1fee84cebea5aa0795865d41558d65237b10a"}
Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.564243 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerDied","Data":"934dad6f8e158e95eb323703ce7615af58818b0adf59144632bbdaa1978d6166"}
Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.663214 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 16:19:50 crc kubenswrapper[4933]: I1202 16:19:50.827372 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 02 16:19:51 crc kubenswrapper[4933]: I1202 16:19:51.611485 4933 generic.go:334] "Generic (PLEG): container finished" podID="873b2114-d05d-4077-9441-60b82e242236" containerID="d4050e9a624bf57313fa9cfee10dee5bfba7aa9a44fdd1818ade1487c18db41a" exitCode=0
Dec 02 16:19:51 crc kubenswrapper[4933]: I1202 16:19:51.611553 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerDied","Data":"d4050e9a624bf57313fa9cfee10dee5bfba7aa9a44fdd1818ade1487c18db41a"}
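The "Killing container with a grace period" entries and the exit codes that follow show the standard termination contract: with gracePeriod=30 the runtime delivers SIGTERM first and escalates to SIGKILL only if the process outlives the grace period. Here ceilometer-0's httpd and agent containers exit 0 (clean shutdown on SIGTERM) while sg-core exits 2. A minimal Go illustration of a process that terminates cleanly when signaled (generic, not ceilometer's actual shutdown path):

    package main

    import (
        "fmt"
        "os"
        "os/signal"
        "syscall"
    )

    func main() {
        // Catch SIGTERM, which the container runtime sends at the start of
        // the grace period; SIGKILL follows only if we outlive it.
        sigc := make(chan os.Signal, 1)
        signal.Notify(sigc, syscall.SIGTERM)

        fmt.Println("running; waiting for SIGTERM")
        <-sigc
        fmt.Println("SIGTERM received; shutting down cleanly")
        os.Exit(0) // clean exit -> kubelet records exitCode=0, as for proxy-httpd above
    }

A container that ignores or mishandles the signal instead surfaces a nonzero exit code in the PLEG "container finished" entry, as sg-core does here.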
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149209 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sskjg\" (UniqueName: \"kubernetes.io/projected/873b2114-d05d-4077-9441-60b82e242236-kube-api-access-sskjg\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149288 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-scripts\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149369 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-config-data\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149414 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-run-httpd\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149574 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-ceilometer-tls-certs\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149626 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-log-httpd\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149724 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-combined-ca-bundle\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.149785 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-sg-core-conf-yaml\") pod \"873b2114-d05d-4077-9441-60b82e242236\" (UID: \"873b2114-d05d-4077-9441-60b82e242236\") " Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.150096 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.150483 4933 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.150680 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.163074 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-scripts" (OuterVolumeSpecName: "scripts") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.215133 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873b2114-d05d-4077-9441-60b82e242236-kube-api-access-sskjg" (OuterVolumeSpecName: "kube-api-access-sskjg") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "kube-api-access-sskjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.253714 4933 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/873b2114-d05d-4077-9441-60b82e242236-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.253751 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sskjg\" (UniqueName: \"kubernetes.io/projected/873b2114-d05d-4077-9441-60b82e242236-kube-api-access-sskjg\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.253764 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.255153 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.261084 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.311884 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-config-data" (OuterVolumeSpecName: "config-data") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.343217 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "873b2114-d05d-4077-9441-60b82e242236" (UID: "873b2114-d05d-4077-9441-60b82e242236"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.355867 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.355899 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.355910 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.355919 4933 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/873b2114-d05d-4077-9441-60b82e242236-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.625727 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"873b2114-d05d-4077-9441-60b82e242236","Type":"ContainerDied","Data":"a1df9b3c3338c6b5c9ec6f29f39f1f3935e4fdd2af7af6b486dbee588a839df2"} Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.625781 4933 scope.go:117] "RemoveContainer" containerID="8e60c5fa73b05e9e2135070d52fefa9fb46633272ae3c6acfde3525a45322b33" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.625791 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.683725 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.696642 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.698867 4933 scope.go:117] "RemoveContainer" containerID="5ae83b32186382be30f0bd99aad1fee84cebea5aa0795865d41558d65237b10a" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.720924 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:52 crc kubenswrapper[4933]: E1202 16:19:52.721696 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="sg-core" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.721769 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="sg-core" Dec 02 16:19:52 crc kubenswrapper[4933]: E1202 16:19:52.721846 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-notification-agent" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.721907 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-notification-agent" Dec 02 16:19:52 crc kubenswrapper[4933]: E1202 16:19:52.721963 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="proxy-httpd" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.722011 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="proxy-httpd" Dec 02 16:19:52 crc kubenswrapper[4933]: E1202 16:19:52.722102 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-central-agent" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.722154 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-central-agent" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.722424 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="sg-core" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.722493 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="proxy-httpd" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.722562 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-central-agent" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.722618 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="873b2114-d05d-4077-9441-60b82e242236" containerName="ceilometer-notification-agent" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.724900 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.731306 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.731489 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.731574 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.742547 4933 scope.go:117] "RemoveContainer" containerID="d4050e9a624bf57313fa9cfee10dee5bfba7aa9a44fdd1818ade1487c18db41a" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.749616 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.767917 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-config-data\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.767975 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d788537f-6bd8-4db8-a47d-2053d24dca64-run-httpd\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.768000 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.768016 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.768043 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99bj8\" (UniqueName: \"kubernetes.io/projected/d788537f-6bd8-4db8-a47d-2053d24dca64-kube-api-access-99bj8\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.768113 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-scripts\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.768161 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d788537f-6bd8-4db8-a47d-2053d24dca64-log-httpd\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.768226 
4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.772705 4933 scope.go:117] "RemoveContainer" containerID="934dad6f8e158e95eb323703ce7615af58818b0adf59144632bbdaa1978d6166" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.870013 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.870311 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-config-data\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.870446 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d788537f-6bd8-4db8-a47d-2053d24dca64-run-httpd\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.870544 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.870630 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.870750 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99bj8\" (UniqueName: \"kubernetes.io/projected/d788537f-6bd8-4db8-a47d-2053d24dca64-kube-api-access-99bj8\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.871005 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d788537f-6bd8-4db8-a47d-2053d24dca64-run-httpd\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.871106 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-scripts\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.871289 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d788537f-6bd8-4db8-a47d-2053d24dca64-log-httpd\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.871695 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d788537f-6bd8-4db8-a47d-2053d24dca64-log-httpd\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.876534 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.879754 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-config-data\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.895097 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.908287 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-scripts\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.909017 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99bj8\" (UniqueName: \"kubernetes.io/projected/d788537f-6bd8-4db8-a47d-2053d24dca64-kube-api-access-99bj8\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:52 crc kubenswrapper[4933]: I1202 16:19:52.920010 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788537f-6bd8-4db8-a47d-2053d24dca64-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d788537f-6bd8-4db8-a47d-2053d24dca64\") " pod="openstack/ceilometer-0" Dec 02 16:19:53 crc kubenswrapper[4933]: I1202 16:19:53.053226 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 16:19:53 crc kubenswrapper[4933]: I1202 16:19:53.054469 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:19:53 crc kubenswrapper[4933]: E1202 16:19:53.054723 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:19:53 crc kubenswrapper[4933]: I1202 16:19:53.068754 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873b2114-d05d-4077-9441-60b82e242236" path="/var/lib/kubelet/pods/873b2114-d05d-4077-9441-60b82e242236/volumes" Dec 02 16:19:53 crc kubenswrapper[4933]: I1202 16:19:53.591835 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 16:19:53 crc kubenswrapper[4933]: I1202 16:19:53.645540 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d788537f-6bd8-4db8-a47d-2053d24dca64","Type":"ContainerStarted","Data":"3551c349429a26947774900a861a235fb3c10026845ce3a8675b2a9b227fb0fc"} Dec 02 16:19:55 crc kubenswrapper[4933]: I1202 16:19:55.193652 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="rabbitmq" containerID="cri-o://3cc74c4fecf1add4592a00be1e97ffe77de72c7f603cf197d35316e05299c703" gracePeriod=604796 Dec 02 16:19:55 crc kubenswrapper[4933]: I1202 16:19:55.868185 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="rabbitmq" containerID="cri-o://688e25e8a56e20f4ea19c240866f399d248c58f9b1fca7617e58dd509627ef00" gracePeriod=604795 Dec 02 16:19:57 crc kubenswrapper[4933]: I1202 16:19:57.533122 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.127:5671: connect: connection refused" Dec 02 16:19:57 crc kubenswrapper[4933]: I1202 16:19:57.873518 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Dec 02 16:20:01 crc kubenswrapper[4933]: I1202 16:20:01.749427 4933 generic.go:334] "Generic (PLEG): container finished" podID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerID="3cc74c4fecf1add4592a00be1e97ffe77de72c7f603cf197d35316e05299c703" exitCode=0 Dec 02 16:20:01 crc kubenswrapper[4933]: I1202 16:20:01.749847 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5df97ea3-3281-4995-bf14-cb09bf09f39d","Type":"ContainerDied","Data":"3cc74c4fecf1add4592a00be1e97ffe77de72c7f603cf197d35316e05299c703"} Dec 02 16:20:02 crc kubenswrapper[4933]: I1202 16:20:02.763731 4933 generic.go:334] "Generic (PLEG): container finished" podID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" 
containerID="688e25e8a56e20f4ea19c240866f399d248c58f9b1fca7617e58dd509627ef00" exitCode=0 Dec 02 16:20:02 crc kubenswrapper[4933]: I1202 16:20:02.763867 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3cdda86-3d0d-486d-ae36-ab6792bff2ab","Type":"ContainerDied","Data":"688e25e8a56e20f4ea19c240866f399d248c58f9b1fca7617e58dd509627ef00"} Dec 02 16:20:04 crc kubenswrapper[4933]: I1202 16:20:04.053619 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:20:04 crc kubenswrapper[4933]: E1202 16:20:04.054243 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:20:05 crc kubenswrapper[4933]: I1202 16:20:05.865985 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-x8flx"] Dec 02 16:20:05 crc kubenswrapper[4933]: I1202 16:20:05.874139 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:05 crc kubenswrapper[4933]: I1202 16:20:05.952258 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-x8flx"] Dec 02 16:20:05 crc kubenswrapper[4933]: I1202 16:20:05.954227 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.054409 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.054678 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wt8z\" (UniqueName: \"kubernetes.io/projected/b8ec8fea-ec11-42d1-a82d-71522aa4397e-kube-api-access-4wt8z\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.054737 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.054811 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-config\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.054866 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.055011 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.055168 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-svc\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.110742 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-x8flx"] Dec 02 16:20:06 crc kubenswrapper[4933]: E1202 16:20:06.111917 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-4wt8z openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-594cb89c79-x8flx" podUID="b8ec8fea-ec11-42d1-a82d-71522aa4397e" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.144697 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5596c69fcc-lb9d4"] Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.147034 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158464 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-svc\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158536 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wt8z\" (UniqueName: \"kubernetes.io/projected/b8ec8fea-ec11-42d1-a82d-71522aa4397e-kube-api-access-4wt8z\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158657 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158737 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-config\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158778 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.158952 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.160751 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-svc\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.161266 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" 
Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.162133 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.165167 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-config\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.168808 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.169056 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5596c69fcc-lb9d4"] Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.169513 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.225190 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wt8z\" (UniqueName: \"kubernetes.io/projected/b8ec8fea-ec11-42d1-a82d-71522aa4397e-kube-api-access-4wt8z\") pod \"dnsmasq-dns-594cb89c79-x8flx\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.261789 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-ovsdbserver-sb\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.262198 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bk57\" (UniqueName: \"kubernetes.io/projected/d8a58bed-855e-481a-a1c6-a2fa6851e55c-kube-api-access-8bk57\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.262254 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-dns-svc\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.262306 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-dns-swift-storage-0\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.262566 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-ovsdbserver-nb\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.262600 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-config\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.262623 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-openstack-edpm-ipam\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365115 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bk57\" (UniqueName: \"kubernetes.io/projected/d8a58bed-855e-481a-a1c6-a2fa6851e55c-kube-api-access-8bk57\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365185 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-dns-svc\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365233 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-dns-swift-storage-0\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365283 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-ovsdbserver-nb\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365319 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-config\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365340 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-openstack-edpm-ipam\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.365413 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-ovsdbserver-sb\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.366244 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-ovsdbserver-sb\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.366295 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-dns-svc\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.366838 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-dns-swift-storage-0\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.367036 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-openstack-edpm-ipam\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.367734 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-config\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.368116 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a58bed-855e-481a-a1c6-a2fa6851e55c-ovsdbserver-nb\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.409537 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bk57\" (UniqueName: \"kubernetes.io/projected/d8a58bed-855e-481a-a1c6-a2fa6851e55c-kube-api-access-8bk57\") pod \"dnsmasq-dns-5596c69fcc-lb9d4\" (UID: \"d8a58bed-855e-481a-a1c6-a2fa6851e55c\") " pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.473857 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.839412 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.852269 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983239 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-config\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983348 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-svc\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983409 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wt8z\" (UniqueName: \"kubernetes.io/projected/b8ec8fea-ec11-42d1-a82d-71522aa4397e-kube-api-access-4wt8z\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983435 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-swift-storage-0\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983743 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-nb\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983869 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-openstack-edpm-ipam\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.983940 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-sb\") pod \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\" (UID: \"b8ec8fea-ec11-42d1-a82d-71522aa4397e\") " Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984047 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-config" (OuterVolumeSpecName: "config") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984208 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984288 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984313 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984523 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984760 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-config\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984773 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984784 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984794 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.984804 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.985002 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:06 crc kubenswrapper[4933]: I1202 16:20:06.989619 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ec8fea-ec11-42d1-a82d-71522aa4397e-kube-api-access-4wt8z" (OuterVolumeSpecName: "kube-api-access-4wt8z") pod "b8ec8fea-ec11-42d1-a82d-71522aa4397e" (UID: "b8ec8fea-ec11-42d1-a82d-71522aa4397e"). InnerVolumeSpecName "kube-api-access-4wt8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.128751 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ec8fea-ec11-42d1-a82d-71522aa4397e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.128939 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wt8z\" (UniqueName: \"kubernetes.io/projected/b8ec8fea-ec11-42d1-a82d-71522aa4397e-kube-api-access-4wt8z\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.532311 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.127:5671: connect: connection refused" Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.849256 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-x8flx" Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.873073 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.923124 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-x8flx"] Dec 02 16:20:07 crc kubenswrapper[4933]: I1202 16:20:07.936983 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-x8flx"] Dec 02 16:20:09 crc kubenswrapper[4933]: I1202 16:20:09.067361 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ec8fea-ec11-42d1-a82d-71522aa4397e" path="/var/lib/kubelet/pods/b8ec8fea-ec11-42d1-a82d-71522aa4397e/volumes" Dec 02 16:20:10 crc kubenswrapper[4933]: E1202 16:20:10.991330 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 02 16:20:10 crc kubenswrapper[4933]: E1202 16:20:10.991770 4933 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 02 16:20:10 crc kubenswrapper[4933]: E1202 16:20:10.991943 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xfj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-747pm_openstack(70ee251f-65d2-476a-ab6d-3fc838cbdf55): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:20:10 crc kubenswrapper[4933]: E1202 16:20:10.993679 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-747pm" podUID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.096068 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.235057 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-plugins\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.235221 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.235319 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-erlang-cookie-secret\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.235481 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-pod-info\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.235536 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-confd\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.235571 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-server-conf\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.236192 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-tls\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.236252 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-config-data\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.236308 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.236326 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-897gp\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-kube-api-access-897gp\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.236363 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-erlang-cookie\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.236433 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-plugins-conf\") pod \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\" (UID: \"e3cdda86-3d0d-486d-ae36-ab6792bff2ab\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.237533 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.241841 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.242286 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.246418 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.248984 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-pod-info" (OuterVolumeSpecName: "pod-info") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.251055 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-kube-api-access-897gp" (OuterVolumeSpecName: "kube-api-access-897gp") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). 
InnerVolumeSpecName "kube-api-access-897gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.265624 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.267950 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.283449 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-config-data" (OuterVolumeSpecName: "config-data") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.312424 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-server-conf" (OuterVolumeSpecName: "server-conf") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339269 4933 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339309 4933 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339321 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339333 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339343 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-897gp\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-kube-api-access-897gp\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339355 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339364 4933 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339390 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.339404 4933 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.377063 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.426990 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e3cdda86-3d0d-486d-ae36-ab6792bff2ab" (UID: "e3cdda86-3d0d-486d-ae36-ab6792bff2ab"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.441850 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3cdda86-3d0d-486d-ae36-ab6792bff2ab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.441884 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:11 crc kubenswrapper[4933]: E1202 16:20:11.770994 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 02 16:20:11 crc kubenswrapper[4933]: E1202 16:20:11.771054 4933 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 02 16:20:11 crc kubenswrapper[4933]: E1202 16:20:11.771194 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h657h57bh86h579h545h599hd4h65bhcbh5c6h9h5f8h574h555h54fh568h64ch5d9h547h56bh597h56ch588h5b6h588h595h7bh7fh5d5h696h699q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99bj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(d788537f-6bd8-4db8-a47d-2053d24dca64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.865091 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.946076 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5df97ea3-3281-4995-bf14-cb09bf09f39d","Type":"ContainerDied","Data":"32085553b903c00bc98aea6fc554407cf0f02237e8cc505f51502b4eda09da3c"} Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.946150 4933 scope.go:117] "RemoveContainer" containerID="3cc74c4fecf1add4592a00be1e97ffe77de72c7f603cf197d35316e05299c703" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.946310 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.952533 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-confd\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.952642 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7ctd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-kube-api-access-b7ctd\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.952688 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.952808 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-plugins-conf\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.952874 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5df97ea3-3281-4995-bf14-cb09bf09f39d-pod-info\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: 
\"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.952930 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-erlang-cookie\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.953019 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-server-conf\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.953048 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5df97ea3-3281-4995-bf14-cb09bf09f39d-erlang-cookie-secret\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.953091 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-plugins\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.953117 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-tls\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.953208 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-config-data\") pod \"5df97ea3-3281-4995-bf14-cb09bf09f39d\" (UID: \"5df97ea3-3281-4995-bf14-cb09bf09f39d\") " Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.957626 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.973804 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.976548 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.977330 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.977404 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3cdda86-3d0d-486d-ae36-ab6792bff2ab","Type":"ContainerDied","Data":"725c79b533ea015717173c67be921234aebffade0a88dab09a5710a1934b5dd4"} Dec 02 16:20:11 crc kubenswrapper[4933]: E1202 16:20:11.979649 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-747pm" podUID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.982487 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df97ea3-3281-4995-bf14-cb09bf09f39d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.983846 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 16:20:11 crc kubenswrapper[4933]: I1202 16:20:11.987714 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5df97ea3-3281-4995-bf14-cb09bf09f39d-pod-info" (OuterVolumeSpecName: "pod-info") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.000041 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.020217 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-kube-api-access-b7ctd" (OuterVolumeSpecName: "kube-api-access-b7ctd") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "kube-api-access-b7ctd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.031398 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-config-data" (OuterVolumeSpecName: "config-data") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065471 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7ctd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-kube-api-access-b7ctd\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065527 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065541 4933 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065556 4933 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5df97ea3-3281-4995-bf14-cb09bf09f39d-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065569 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065581 4933 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5df97ea3-3281-4995-bf14-cb09bf09f39d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065592 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065592 4933 scope.go:117] "RemoveContainer" containerID="c0cbe56a7a65f4eb7aa06267f0be741d781f0a45fc009bb3d23073ffd1f9bfcb" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065603 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.065760 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.088341 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.131139 4933 scope.go:117] "RemoveContainer" containerID="688e25e8a56e20f4ea19c240866f399d248c58f9b1fca7617e58dd509627ef00" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.142318 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-server-conf" (OuterVolumeSpecName: "server-conf") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.149933 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.169198 4933 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5df97ea3-3281-4995-bf14-cb09bf09f39d-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.175463 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5df97ea3-3281-4995-bf14-cb09bf09f39d" (UID: "5df97ea3-3281-4995-bf14-cb09bf09f39d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.186892 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.194654 4933 scope.go:117] "RemoveContainer" containerID="7da6f3952a39b50e00a2e909de5ee41531348ab856141537311ba7dea9e99461" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.195071 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: E1202 16:20:12.195901 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="setup-container" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.195924 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="setup-container" Dec 02 16:20:12 crc kubenswrapper[4933]: E1202 16:20:12.195990 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="setup-container" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.196000 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="setup-container" Dec 02 16:20:12 crc kubenswrapper[4933]: E1202 16:20:12.196030 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="rabbitmq" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.196039 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="rabbitmq" Dec 02 16:20:12 crc kubenswrapper[4933]: E1202 16:20:12.196056 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="rabbitmq" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.196065 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="rabbitmq" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.196330 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" containerName="rabbitmq" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.196365 
4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" containerName="rabbitmq" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.200271 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.206102 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.206380 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.206608 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.206788 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.206905 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.207257 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d9fbb" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.209169 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.234956 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.271752 4933 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5df97ea3-3281-4995-bf14-cb09bf09f39d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.271783 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.298913 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.319461 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.341863 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5596c69fcc-lb9d4"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.379549 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd27c6c8-91fa-4036-b50e-996c263b202c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.379709 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.379779 
4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.379845 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.379879 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.379951 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd27c6c8-91fa-4036-b50e-996c263b202c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.380004 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.380101 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.380152 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.380216 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njm4k\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-kube-api-access-njm4k\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.380249 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.397036 4933 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.400692 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.404412 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-58f87" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.404715 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.404873 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.405009 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.405150 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.405333 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.407801 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.422860 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483011 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483075 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd27c6c8-91fa-4036-b50e-996c263b202c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483103 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483157 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483177 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483200 4933 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483228 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483262 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njm4k\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-kube-api-access-njm4k\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483280 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483299 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483314 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483350 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-config-data\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483368 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd27c6c8-91fa-4036-b50e-996c263b202c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483420 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483450 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc54r\" 
(UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-kube-api-access-kc54r\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483473 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483490 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483525 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483719 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.483974 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.484024 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.484940 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.485066 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.485108 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.485068 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.485701 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.486053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.486858 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd27c6c8-91fa-4036-b50e-996c263b202c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.488637 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.490261 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd27c6c8-91fa-4036-b50e-996c263b202c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.490372 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.490437 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd27c6c8-91fa-4036-b50e-996c263b202c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.500517 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njm4k\" (UniqueName: \"kubernetes.io/projected/cd27c6c8-91fa-4036-b50e-996c263b202c-kube-api-access-njm4k\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.541374 4933 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd27c6c8-91fa-4036-b50e-996c263b202c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587042 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc54r\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-kube-api-access-kc54r\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587091 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587134 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587158 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587206 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587266 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587282 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587331 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587349 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " 
pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587382 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-config-data\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.587434 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.588348 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.588886 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.588939 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.588956 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.589047 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-config-data\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.589185 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.591243 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.591654 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.592473 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.593784 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.606810 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc54r\" (UniqueName: \"kubernetes.io/projected/528bcbae-e7ca-4ad1-8e7d-571da6cb972b-kube-api-access-kc54r\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.651083 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"528bcbae-e7ca-4ad1-8e7d-571da6cb972b\") " pod="openstack/rabbitmq-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.831185 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:20:12 crc kubenswrapper[4933]: I1202 16:20:12.845890 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.012807 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d788537f-6bd8-4db8-a47d-2053d24dca64","Type":"ContainerStarted","Data":"feec5abf9e4c5579eaee5466b8251036b1775183109bf2658bdfcad159c26173"}
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.018731 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" event={"ID":"d8a58bed-855e-481a-a1c6-a2fa6851e55c","Type":"ContainerStarted","Data":"4d785975db6c411f349ac56de381bc96713fa15918641e445e91c2b1c0c54688"}
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.018783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" event={"ID":"d8a58bed-855e-481a-a1c6-a2fa6851e55c","Type":"ContainerStarted","Data":"57f11aa3eb54ebe552e520de20a411467935accff2e695c71e55a5410619cf64"}
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.096028 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df97ea3-3281-4995-bf14-cb09bf09f39d" path="/var/lib/kubelet/pods/5df97ea3-3281-4995-bf14-cb09bf09f39d/volumes"
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.097254 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3cdda86-3d0d-486d-ae36-ab6792bff2ab" path="/var/lib/kubelet/pods/e3cdda86-3d0d-486d-ae36-ab6792bff2ab/volumes"
Dec 02 16:20:13 crc kubenswrapper[4933]: W1202 16:20:13.498559 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528bcbae_e7ca_4ad1_8e7d_571da6cb972b.slice/crio-6b72877e9289a4b9d79f189196d87249508877b92057c131e63e89b5b18f1b19 WatchSource:0}: Error finding container 6b72877e9289a4b9d79f189196d87249508877b92057c131e63e89b5b18f1b19: Status 404 returned error can't find the container with id 6b72877e9289a4b9d79f189196d87249508877b92057c131e63e89b5b18f1b19
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.520699 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 16:20:13 crc kubenswrapper[4933]: I1202 16:20:13.654160 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 02 16:20:14 crc kubenswrapper[4933]: I1202 16:20:14.048847 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd27c6c8-91fa-4036-b50e-996c263b202c","Type":"ContainerStarted","Data":"872d921079f8c2500260e510586a0c43d04e20b6d7a9445b6bf901beebc8e5f0"}
Dec 02 16:20:14 crc kubenswrapper[4933]: I1202 16:20:14.051445 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"528bcbae-e7ca-4ad1-8e7d-571da6cb972b","Type":"ContainerStarted","Data":"6b72877e9289a4b9d79f189196d87249508877b92057c131e63e89b5b18f1b19"}
Dec 02 16:20:14 crc kubenswrapper[4933]: I1202 16:20:14.054476 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d788537f-6bd8-4db8-a47d-2053d24dca64","Type":"ContainerStarted","Data":"04d822c446a298411cc7176701688853ec87107c8b839c0147c1e1dc485ee4a9"}
Dec 02 16:20:14 crc kubenswrapper[4933]: I1202 16:20:14.056597 4933 generic.go:334] "Generic (PLEG): container finished" podID="d8a58bed-855e-481a-a1c6-a2fa6851e55c" containerID="4d785975db6c411f349ac56de381bc96713fa15918641e445e91c2b1c0c54688" exitCode=0
Dec 02 16:20:14 crc kubenswrapper[4933]: I1202 16:20:14.056665 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" event={"ID":"d8a58bed-855e-481a-a1c6-a2fa6851e55c","Type":"ContainerDied","Data":"4d785975db6c411f349ac56de381bc96713fa15918641e445e91c2b1c0c54688"}
Dec 02 16:20:15 crc kubenswrapper[4933]: I1202 16:20:15.077150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" event={"ID":"d8a58bed-855e-481a-a1c6-a2fa6851e55c","Type":"ContainerStarted","Data":"bbdf49575d720cd338d3933f74f4cbd7c13ca96853c2e0fbae5e1f29217523ff"}
Dec 02 16:20:15 crc kubenswrapper[4933]: I1202 16:20:15.077455 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4"
Dec 02 16:20:15 crc kubenswrapper[4933]: I1202 16:20:15.104994 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4" podStartSLOduration=9.104966797 podStartE2EDuration="9.104966797s" podCreationTimestamp="2025-12-02 16:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:20:15.096039596 +0000 UTC m=+1678.347266299" watchObservedRunningTime="2025-12-02 16:20:15.104966797 +0000 UTC m=+1678.356193520"
Dec 02 16:20:16 crc kubenswrapper[4933]: I1202 16:20:16.108788 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd27c6c8-91fa-4036-b50e-996c263b202c","Type":"ContainerStarted","Data":"84c2fc8a3efba357282589e14e34be2884195e564ce93ce7d024e625b6c03af5"}
Dec 02 16:20:16 crc kubenswrapper[4933]: I1202 16:20:16.110414 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"528bcbae-e7ca-4ad1-8e7d-571da6cb972b","Type":"ContainerStarted","Data":"0bd8537fba2db527eab80712194d7edafd44322f20df6cc5891f400c21b3d841"}
Dec 02 16:20:18 crc kubenswrapper[4933]: I1202 16:20:18.056206 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"
Dec 02 16:20:18 crc kubenswrapper[4933]: E1202 16:20:18.057214 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:20:19 crc kubenswrapper[4933]: E1202 16:20:19.492104 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="d788537f-6bd8-4db8-a47d-2053d24dca64"
Dec 02 16:20:20 crc kubenswrapper[4933]: I1202 16:20:20.184816 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d788537f-6bd8-4db8-a47d-2053d24dca64","Type":"ContainerStarted","Data":"dce1d06f57e72e2323e956648adb5d15024d9c24404b5b3ea1b0e912ec6d72b0"}
Dec 02 16:20:20 crc kubenswrapper[4933]: I1202 16:20:20.185564 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 16:20:20 crc kubenswrapper[4933]: E1202 16:20:20.186525 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="d788537f-6bd8-4db8-a47d-2053d24dca64"
Dec 02 16:20:21 crc kubenswrapper[4933]: E1202 16:20:21.196633 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="d788537f-6bd8-4db8-a47d-2053d24dca64"
Dec 02 16:20:21 crc kubenswrapper[4933]: I1202 16:20:21.475982 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5596c69fcc-lb9d4"
Dec 02 16:20:21 crc kubenswrapper[4933]: I1202 16:20:21.553742 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"]
Dec 02 16:20:21 crc kubenswrapper[4933]: I1202 16:20:21.554056 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerName="dnsmasq-dns" containerID="cri-o://7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45" gracePeriod=10
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.163163 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.207399 4933 generic.go:334] "Generic (PLEG): container finished" podID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerID="7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45" exitCode=0
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.207445 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" event={"ID":"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4","Type":"ContainerDied","Data":"7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45"}
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.207460 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.207488 4933 scope.go:117] "RemoveContainer" containerID="7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.207477 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-xjx4w" event={"ID":"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4","Type":"ContainerDied","Data":"d70e6ae6dff7504fbed38c07dc7234938f58f90ee33d8b5ef964005348c2197d"}
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.233129 4933 scope.go:117] "RemoveContainer" containerID="9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.262762 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-svc\") pod \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") "
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.262925 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7xw\" (UniqueName: \"kubernetes.io/projected/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-kube-api-access-cr7xw\") pod \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") "
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.263057 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-sb\") pod \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") "
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.263152 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-config\") pod \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") "
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.263205 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-nb\") pod \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") "
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.263343 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-swift-storage-0\") pod \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\" (UID: \"000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4\") "
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.268228 4933 scope.go:117] "RemoveContainer" containerID="7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45"
Dec 02 16:20:22 crc kubenswrapper[4933]: E1202 16:20:22.269545 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45\": container with ID starting with 7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45 not found: ID does not exist" containerID="7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.269696 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45"} err="failed to get container status \"7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45\": rpc error: code = NotFound desc = could not find container \"7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45\": container with ID starting with 7c941942bb2f62e644f5bfe46064a4ba2d6bf94f7382f3849356d2c44a2a3c45 not found: ID does not exist"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.269800 4933 scope.go:117] "RemoveContainer" containerID="9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.270341 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-kube-api-access-cr7xw" (OuterVolumeSpecName: "kube-api-access-cr7xw") pod "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" (UID: "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4"). InnerVolumeSpecName "kube-api-access-cr7xw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:20:22 crc kubenswrapper[4933]: E1202 16:20:22.271701 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6\": container with ID starting with 9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6 not found: ID does not exist" containerID="9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.271851 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6"} err="failed to get container status \"9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6\": rpc error: code = NotFound desc = could not find container \"9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6\": container with ID starting with 9f30483237a36797d943b695db267e6a2e3addd2847dcc1057e600d017d6dff6 not found: ID does not exist"
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.339424 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" (UID: "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.341293 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" (UID: "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.344224 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-config" (OuterVolumeSpecName: "config") pod "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" (UID: "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.366167 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" (UID: "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.366709 4933 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.366738 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr7xw\" (UniqueName: \"kubernetes.io/projected/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-kube-api-access-cr7xw\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.366750 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.366759 4933 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-config\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.366768 4933 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.373641 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" (UID: "000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.469982 4933 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.546117 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"]
Dec 02 16:20:22 crc kubenswrapper[4933]: I1202 16:20:22.557894 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-xjx4w"]
Dec 02 16:20:23 crc kubenswrapper[4933]: I1202 16:20:23.069013 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" path="/var/lib/kubelet/pods/000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4/volumes"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.253690 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-747pm" event={"ID":"70ee251f-65d2-476a-ab6d-3fc838cbdf55","Type":"ContainerStarted","Data":"ff5b687922c855aa07c54509f6bbccb820b172d2feaa3e082b62648bebf45765"}
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.267851 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-747pm" podStartSLOduration=2.697030743 podStartE2EDuration="39.267839514s" podCreationTimestamp="2025-12-02 16:19:47 +0000 UTC" firstStartedPulling="2025-12-02 16:19:48.677616645 +0000 UTC m=+1651.928843338" lastFinishedPulling="2025-12-02 16:20:25.248425396 +0000 UTC m=+1688.499652109" observedRunningTime="2025-12-02 16:20:26.267115065 +0000 UTC m=+1689.518341768" watchObservedRunningTime="2025-12-02 16:20:26.267839514 +0000 UTC m=+1689.519066207"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.713284 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"]
Dec 02 16:20:26 crc kubenswrapper[4933]: E1202 16:20:26.713748 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerName="init"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.713763 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerName="init"
Dec 02 16:20:26 crc kubenswrapper[4933]: E1202 16:20:26.713794 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerName="dnsmasq-dns"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.713800 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerName="dnsmasq-dns"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.714080 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="000e4cc8-9c69-4cd9-8d10-ea8a7a3411b4" containerName="dnsmasq-dns"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.714867 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.721152 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.721356 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.721506 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.721604 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.730184 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"]
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.869531 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.869787 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzp24\" (UniqueName: \"kubernetes.io/projected/b712434b-6106-44ed-aa67-1328e50cdb2c-kube-api-access-fzp24\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.869989 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.870197 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.975194 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzp24\" (UniqueName: \"kubernetes.io/projected/b712434b-6106-44ed-aa67-1328e50cdb2c-kube-api-access-fzp24\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.975292 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.975375 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.975514 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.984513 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:26 crc kubenswrapper[4933]: I1202 16:20:26.994615 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:27 crc kubenswrapper[4933]: I1202 16:20:27.007740 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzp24\" (UniqueName: \"kubernetes.io/projected/b712434b-6106-44ed-aa67-1328e50cdb2c-kube-api-access-fzp24\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:27 crc kubenswrapper[4933]: I1202 16:20:27.019651 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mznps\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:27 crc kubenswrapper[4933]: I1202 16:20:27.048775 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"
Dec 02 16:20:27 crc kubenswrapper[4933]: I1202 16:20:27.725116 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps"]
Dec 02 16:20:28 crc kubenswrapper[4933]: I1202 16:20:28.281079 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" event={"ID":"b712434b-6106-44ed-aa67-1328e50cdb2c","Type":"ContainerStarted","Data":"9db0325c7877bb8068122588e86bcd0f997b56d2355a29b12048491336f7bd82"}
Dec 02 16:20:28 crc kubenswrapper[4933]: I1202 16:20:28.285227 4933 generic.go:334] "Generic (PLEG): container finished" podID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" containerID="ff5b687922c855aa07c54509f6bbccb820b172d2feaa3e082b62648bebf45765" exitCode=0
Dec 02 16:20:28 crc kubenswrapper[4933]: I1202 16:20:28.285262 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-747pm" event={"ID":"70ee251f-65d2-476a-ab6d-3fc838cbdf55","Type":"ContainerDied","Data":"ff5b687922c855aa07c54509f6bbccb820b172d2feaa3e082b62648bebf45765"}
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.722071 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-747pm"
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.813430 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data\") pod \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") "
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.813649 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-combined-ca-bundle\") pod \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") "
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.813714 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfj5\" (UniqueName: \"kubernetes.io/projected/70ee251f-65d2-476a-ab6d-3fc838cbdf55-kube-api-access-2xfj5\") pod \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") "
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.822166 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ee251f-65d2-476a-ab6d-3fc838cbdf55-kube-api-access-2xfj5" (OuterVolumeSpecName: "kube-api-access-2xfj5") pod "70ee251f-65d2-476a-ab6d-3fc838cbdf55" (UID: "70ee251f-65d2-476a-ab6d-3fc838cbdf55"). InnerVolumeSpecName "kube-api-access-2xfj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.851982 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70ee251f-65d2-476a-ab6d-3fc838cbdf55" (UID: "70ee251f-65d2-476a-ab6d-3fc838cbdf55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.915856 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data" (OuterVolumeSpecName: "config-data") pod "70ee251f-65d2-476a-ab6d-3fc838cbdf55" (UID: "70ee251f-65d2-476a-ab6d-3fc838cbdf55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.916658 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data\") pod \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\" (UID: \"70ee251f-65d2-476a-ab6d-3fc838cbdf55\") "
Dec 02 16:20:30 crc kubenswrapper[4933]: W1202 16:20:30.916742 4933 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/70ee251f-65d2-476a-ab6d-3fc838cbdf55/volumes/kubernetes.io~secret/config-data
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.916757 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data" (OuterVolumeSpecName: "config-data") pod "70ee251f-65d2-476a-ab6d-3fc838cbdf55" (UID: "70ee251f-65d2-476a-ab6d-3fc838cbdf55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.917768 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.917804 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfj5\" (UniqueName: \"kubernetes.io/projected/70ee251f-65d2-476a-ab6d-3fc838cbdf55-kube-api-access-2xfj5\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:30 crc kubenswrapper[4933]: I1202 16:20:30.917846 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ee251f-65d2-476a-ab6d-3fc838cbdf55-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 16:20:31 crc kubenswrapper[4933]: I1202 16:20:31.318055 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-747pm" event={"ID":"70ee251f-65d2-476a-ab6d-3fc838cbdf55","Type":"ContainerDied","Data":"bcea26e68bf71bfda153742f8cc0c3dd020a522454987f76c4f4b40e84e45799"}
Dec 02 16:20:31 crc kubenswrapper[4933]: I1202 16:20:31.318369 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcea26e68bf71bfda153742f8cc0c3dd020a522454987f76c4f4b40e84e45799"
Dec 02 16:20:31 crc kubenswrapper[4933]: I1202 16:20:31.318120 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-747pm"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.053870 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"
Dec 02 16:20:32 crc kubenswrapper[4933]: E1202 16:20:32.054221 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.672793 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-c8865d5c5-ktz2c"]
Dec 02 16:20:32 crc kubenswrapper[4933]: E1202 16:20:32.675497 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" containerName="heat-db-sync"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.675526 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" containerName="heat-db-sync"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.675774 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" containerName="heat-db-sync"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.676698 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.696663 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c8865d5c5-ktz2c"]
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.719097 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7cfb68f886-5mpb2"]
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.721231 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.753560 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cfb68f886-5mpb2"]
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.771116 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9j4\" (UniqueName: \"kubernetes.io/projected/1aa97de0-3a2a-474c-904e-5fa59773c33c-kube-api-access-tn9j4\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.771243 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-config-data\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.771474 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-config-data-custom\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.771563 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-combined-ca-bundle\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.803036 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7d8849d76d-cc29j"]
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.816533 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.851548 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7d8849d76d-cc29j"]
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873287 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-config-data-custom\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873362 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-combined-ca-bundle\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873399 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9j4\" (UniqueName: \"kubernetes.io/projected/1aa97de0-3a2a-474c-904e-5fa59773c33c-kube-api-access-tn9j4\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873435 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-internal-tls-certs\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873483 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-config-data\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873521 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-combined-ca-bundle\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873560 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-public-tls-certs\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873581 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-config-data\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873596 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-config-data-custom\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.873612 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpf6\" (UniqueName: \"kubernetes.io/projected/0f856765-5f4c-445f-bcd1-736db6fb2c56-kube-api-access-8qpf6\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.881846 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-config-data-custom\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.883753 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-config-data\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.894201 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9j4\" (UniqueName: \"kubernetes.io/projected/1aa97de0-3a2a-474c-904e-5fa59773c33c-kube-api-access-tn9j4\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.901649 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa97de0-3a2a-474c-904e-5fa59773c33c-combined-ca-bundle\") pod \"heat-engine-c8865d5c5-ktz2c\" (UID: \"1aa97de0-3a2a-474c-904e-5fa59773c33c\") " pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976079 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-internal-tls-certs\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976187 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-combined-ca-bundle\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976209 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-config-data-custom\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976254 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-public-tls-certs\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976321 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-config-data\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976341 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-config-data-custom\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976363 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpf6\" (UniqueName: \"kubernetes.io/projected/0f856765-5f4c-445f-bcd1-736db6fb2c56-kube-api-access-8qpf6\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976409 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvp4p\" (UniqueName: \"kubernetes.io/projected/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-kube-api-access-dvp4p\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976458 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-config-data\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976500 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-public-tls-certs\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976519 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-internal-tls-certs\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.976554 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-combined-ca-bundle\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.982290 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-internal-tls-certs\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.984640 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-combined-ca-bundle\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.985218 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-config-data\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.990373 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-public-tls-certs\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.995723 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f856765-5f4c-445f-bcd1-736db6fb2c56-config-data-custom\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:32 crc kubenswrapper[4933]: I1202 16:20:32.997496 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpf6\" (UniqueName: \"kubernetes.io/projected/0f856765-5f4c-445f-bcd1-736db6fb2c56-kube-api-access-8qpf6\") pod \"heat-api-7cfb68f886-5mpb2\" (UID: \"0f856765-5f4c-445f-bcd1-736db6fb2c56\") " pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.004475 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.055388 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.080288 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-public-tls-certs\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.080332 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-internal-tls-certs\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.080383 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-combined-ca-bundle\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.080513 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-config-data-custom\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.080663 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvp4p\" (UniqueName: \"kubernetes.io/projected/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-kube-api-access-dvp4p\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.080725 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-config-data\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.085386 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-config-data\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.085766 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-public-tls-certs\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.087537 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-config-data-custom\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.087905 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-combined-ca-bundle\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.089551 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-internal-tls-certs\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.096958 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvp4p\" (UniqueName: \"kubernetes.io/projected/6ce3e9ef-c14e-49c8-bd2b-8b268f028516-kube-api-access-dvp4p\") pod \"heat-cfnapi-7d8849d76d-cc29j\" (UID: \"6ce3e9ef-c14e-49c8-bd2b-8b268f028516\") " pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:33 crc kubenswrapper[4933]: I1202 16:20:33.150035 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:35 crc kubenswrapper[4933]: I1202 16:20:35.114490 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 02 16:20:38 crc kubenswrapper[4933]: I1202 16:20:38.407518 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d788537f-6bd8-4db8-a47d-2053d24dca64","Type":"ContainerStarted","Data":"902ca51691ae3c700a75d25c7bf6ee78dcc281dcc72681c621d1f3f6ed991d76"}
Dec 02 16:20:38 crc kubenswrapper[4933]: I1202 16:20:38.430502 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9424640439999998 podStartE2EDuration="46.43048066s" podCreationTimestamp="2025-12-02 16:19:52 +0000 UTC" firstStartedPulling="2025-12-02 16:19:53.584033775 +0000 UTC m=+1656.835260478" lastFinishedPulling="2025-12-02 16:20:38.072050391 +0000 UTC m=+1701.323277094" observedRunningTime="2025-12-02 16:20:38.426580214 +0000 UTC m=+1701.677806917" watchObservedRunningTime="2025-12-02 16:20:38.43048066 +0000 UTC m=+1701.681707363"
Dec 02 16:20:38 crc kubenswrapper[4933]: I1202 16:20:38.615378 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c8865d5c5-ktz2c"]
Dec 02 16:20:38 crc kubenswrapper[4933]: I1202 16:20:38.696887 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cfb68f886-5mpb2"]
Dec 02 16:20:38 crc kubenswrapper[4933]: W1202 16:20:38.774402 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ce3e9ef_c14e_49c8_bd2b_8b268f028516.slice/crio-394e1be52e9ade1d4e4676d920aff49fec8e5eabbe52d5821408bf0db2d8d415 WatchSource:0}: Error finding container 394e1be52e9ade1d4e4676d920aff49fec8e5eabbe52d5821408bf0db2d8d415: Status 404 returned error can't find the container with id 394e1be52e9ade1d4e4676d920aff49fec8e5eabbe52d5821408bf0db2d8d415
Dec 02 16:20:38 crc kubenswrapper[4933]: I1202 16:20:38.775009 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7d8849d76d-cc29j"]
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.420698 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cfb68f886-5mpb2" event={"ID":"0f856765-5f4c-445f-bcd1-736db6fb2c56","Type":"ContainerStarted","Data":"a7183e741f0ca94a914d050a01bc1eb691aa76b727febebcd29fa8fe0b01e450"}
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.422449 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d8849d76d-cc29j" event={"ID":"6ce3e9ef-c14e-49c8-bd2b-8b268f028516","Type":"ContainerStarted","Data":"394e1be52e9ade1d4e4676d920aff49fec8e5eabbe52d5821408bf0db2d8d415"}
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.423890 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c8865d5c5-ktz2c" event={"ID":"1aa97de0-3a2a-474c-904e-5fa59773c33c","Type":"ContainerStarted","Data":"bd5892689d76d89ea86fd01cad76650b61f99787888c6430da10593be6d00de7"}
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.423917 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c8865d5c5-ktz2c" event={"ID":"1aa97de0-3a2a-474c-904e-5fa59773c33c","Type":"ContainerStarted","Data":"87d1965de8c74e77edf910fea2da26f8a862dcbbd7883e3a70facf73f466a667"}
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.424492 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-c8865d5c5-ktz2c"
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.428899 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" event={"ID":"b712434b-6106-44ed-aa67-1328e50cdb2c","Type":"ContainerStarted","Data":"fe3546f0232b704a18dbcd1313184fdc4ad38c4584bfb55dfa9481f7182e543a"}
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.463361 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-c8865d5c5-ktz2c" podStartSLOduration=7.46334188 podStartE2EDuration="7.46334188s" podCreationTimestamp="2025-12-02 16:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:20:39.439388093 +0000 UTC m=+1702.690614806" watchObservedRunningTime="2025-12-02 16:20:39.46334188 +0000 UTC m=+1702.714568583"
Dec 02 16:20:39 crc kubenswrapper[4933]: I1202 16:20:39.486591 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" podStartSLOduration=3.118276006 podStartE2EDuration="13.486571967s" podCreationTimestamp="2025-12-02 16:20:26 +0000 UTC" firstStartedPulling="2025-12-02 16:20:27.71375965 +0000 UTC m=+1690.964986353" lastFinishedPulling="2025-12-02 16:20:38.082055611 +0000 UTC m=+1701.333282314" observedRunningTime="2025-12-02 16:20:39.456230448 +0000 UTC m=+1702.707457151" watchObservedRunningTime="2025-12-02 16:20:39.486571967 +0000 UTC m=+1702.737798670"
Dec 02 16:20:41 crc kubenswrapper[4933]: I1202 16:20:41.452899 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cfb68f886-5mpb2" event={"ID":"0f856765-5f4c-445f-bcd1-736db6fb2c56","Type":"ContainerStarted","Data":"9d0cdafa4c7d89d5c68448390d0ff348a2a9ae93dd6323fffc0568beba44c53e"}
Dec 02 16:20:41 crc kubenswrapper[4933]: I1202 16:20:41.453250 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:41 crc kubenswrapper[4933]: I1202 16:20:41.454712 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d8849d76d-cc29j" event={"ID":"6ce3e9ef-c14e-49c8-bd2b-8b268f028516","Type":"ContainerStarted","Data":"8f4a8974c147f107ea9205682b827c26af62217c3aca4df3f15a2d3302da696e"}
Dec 02 16:20:41 crc kubenswrapper[4933]: I1202 16:20:41.455245 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:41 crc kubenswrapper[4933]: I1202 16:20:41.481549 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7cfb68f886-5mpb2" podStartSLOduration=7.762551111 podStartE2EDuration="9.481529409s" podCreationTimestamp="2025-12-02 16:20:32 +0000 UTC" firstStartedPulling="2025-12-02 16:20:38.595959928 +0000 UTC m=+1701.847186631" lastFinishedPulling="2025-12-02 16:20:40.314938226 +0000 UTC m=+1703.566164929" observedRunningTime="2025-12-02 16:20:41.468961879 +0000 UTC m=+1704.720188602" watchObservedRunningTime="2025-12-02 16:20:41.481529409 +0000 UTC m=+1704.732756122"
Dec 02 16:20:41 crc kubenswrapper[4933]: I1202 16:20:41.497180 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7d8849d76d-cc29j" podStartSLOduration=7.960419104 podStartE2EDuration="9.497160971s" podCreationTimestamp="2025-12-02 16:20:32 +0000 UTC" firstStartedPulling="2025-12-02 16:20:38.782963278 +0000 UTC m=+1702.034189981" lastFinishedPulling="2025-12-02 16:20:40.319705145 +0000 UTC m=+1703.570931848" observedRunningTime="2025-12-02 16:20:41.489449652 +0000 UTC m=+1704.740676365" watchObservedRunningTime="2025-12-02 16:20:41.497160971 +0000 UTC m=+1704.748387674"
Dec 02 16:20:42 crc kubenswrapper[4933]: I1202 16:20:42.973338 4933 scope.go:117] "RemoveContainer" containerID="feb9062af93b73044a3514582bb2b43e14a1b8d6f2d488fd8ebfa813c946cfae"
Dec 02 16:20:43 crc kubenswrapper[4933]: I1202 16:20:43.010410 4933 scope.go:117] "RemoveContainer" containerID="f56f437ea9e5050b83f13c02df967bcb6a5972ddd757e066055b967858cf917d"
Dec 02 16:20:45 crc kubenswrapper[4933]: I1202 16:20:45.053576 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7"
Dec 02 16:20:45 crc kubenswrapper[4933]: E1202 16:20:45.055067 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:20:48 crc kubenswrapper[4933]: I1202 16:20:48.537499 4933 generic.go:334] "Generic (PLEG): container finished" podID="528bcbae-e7ca-4ad1-8e7d-571da6cb972b" containerID="0bd8537fba2db527eab80712194d7edafd44322f20df6cc5891f400c21b3d841" exitCode=0
Dec 02 16:20:48 crc kubenswrapper[4933]: I1202 16:20:48.537691 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"528bcbae-e7ca-4ad1-8e7d-571da6cb972b","Type":"ContainerDied","Data":"0bd8537fba2db527eab80712194d7edafd44322f20df6cc5891f400c21b3d841"}
Dec 02 16:20:48 crc kubenswrapper[4933]: I1202 16:20:48.548475 4933 generic.go:334] "Generic (PLEG): container finished" podID="cd27c6c8-91fa-4036-b50e-996c263b202c" containerID="84c2fc8a3efba357282589e14e34be2884195e564ce93ce7d024e625b6c03af5" exitCode=0
Dec 02 16:20:48 crc kubenswrapper[4933]: I1202 16:20:48.548535 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd27c6c8-91fa-4036-b50e-996c263b202c","Type":"ContainerDied","Data":"84c2fc8a3efba357282589e14e34be2884195e564ce93ce7d024e625b6c03af5"}
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.562566 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd27c6c8-91fa-4036-b50e-996c263b202c","Type":"ContainerStarted","Data":"d09d37790d3c942ab0e057ecff68810d7d2d4c19631508adb220bcdc0ce8ab19"}
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.563576 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.568747 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"528bcbae-e7ca-4ad1-8e7d-571da6cb972b","Type":"ContainerStarted","Data":"11cad03c552f8632b726656edf86e70dba1183a7dc249d1b6c28c4c1f5930793"}
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.569075 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.611342 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.611316812 podStartE2EDuration="37.611316812s" podCreationTimestamp="2025-12-02 16:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:20:49.599450592 +0000 UTC m=+1712.850677305" watchObservedRunningTime="2025-12-02 16:20:49.611316812 +0000 UTC m=+1712.862543515"
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.635694 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.63566874 podStartE2EDuration="37.63566874s" podCreationTimestamp="2025-12-02 16:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:20:49.625469784 +0000 UTC m=+1712.876696487" watchObservedRunningTime="2025-12-02 16:20:49.63566874 +0000 UTC m=+1712.886895443"
Dec 02 16:20:49 crc kubenswrapper[4933]: I1202 16:20:49.996841 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7cfb68f886-5mpb2"
Dec 02 16:20:50 crc kubenswrapper[4933]: I1202 16:20:50.031864 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7d8849d76d-cc29j"
Dec 02 16:20:50 crc kubenswrapper[4933]: I1202 16:20:50.074736 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5558cd5dc7-s4k68"]
Dec 02 16:20:50 crc kubenswrapper[4933]: I1202 16:20:50.074997 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5558cd5dc7-s4k68" podUID="4caba072-5d91-4126-909a-b4e7ad6167d6" containerName="heat-api" containerID="cri-o://502fb37e69c4ed4b648add46e46605f39c20b2666f164ad2678b88200437e652" gracePeriod=60
Dec 02 16:20:50 crc kubenswrapper[4933]: I1202 16:20:50.211075 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5979c644c5-dckr9"]
Dec 02 16:20:50 crc kubenswrapper[4933]: I1202 16:20:50.211341 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5979c644c5-dckr9"
podUID="3aa14443-9372-4f3d-bc34-3463b051b107" containerName="heat-cfnapi" containerID="cri-o://0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f" gracePeriod=60 Dec 02 16:20:51 crc kubenswrapper[4933]: I1202 16:20:51.593486 4933 generic.go:334] "Generic (PLEG): container finished" podID="b712434b-6106-44ed-aa67-1328e50cdb2c" containerID="fe3546f0232b704a18dbcd1313184fdc4ad38c4584bfb55dfa9481f7182e543a" exitCode=0 Dec 02 16:20:51 crc kubenswrapper[4933]: I1202 16:20:51.593531 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" event={"ID":"b712434b-6106-44ed-aa67-1328e50cdb2c","Type":"ContainerDied","Data":"fe3546f0232b704a18dbcd1313184fdc4ad38c4584bfb55dfa9481f7182e543a"} Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.048378 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-c8865d5c5-ktz2c" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.133673 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-64b6fd7f9d-b5l2s"] Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.134324 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" podUID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" containerName="heat-engine" containerID="cri-o://233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8" gracePeriod=60 Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.171547 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.198989 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-ssh-key\") pod \"b712434b-6106-44ed-aa67-1328e50cdb2c\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.199111 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzp24\" (UniqueName: \"kubernetes.io/projected/b712434b-6106-44ed-aa67-1328e50cdb2c-kube-api-access-fzp24\") pod \"b712434b-6106-44ed-aa67-1328e50cdb2c\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.199210 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-repo-setup-combined-ca-bundle\") pod \"b712434b-6106-44ed-aa67-1328e50cdb2c\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.199324 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-inventory\") pod \"b712434b-6106-44ed-aa67-1328e50cdb2c\" (UID: \"b712434b-6106-44ed-aa67-1328e50cdb2c\") " Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.218976 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b712434b-6106-44ed-aa67-1328e50cdb2c" (UID: "b712434b-6106-44ed-aa67-1328e50cdb2c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.237119 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b712434b-6106-44ed-aa67-1328e50cdb2c-kube-api-access-fzp24" (OuterVolumeSpecName: "kube-api-access-fzp24") pod "b712434b-6106-44ed-aa67-1328e50cdb2c" (UID: "b712434b-6106-44ed-aa67-1328e50cdb2c"). InnerVolumeSpecName "kube-api-access-fzp24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.247842 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b712434b-6106-44ed-aa67-1328e50cdb2c" (UID: "b712434b-6106-44ed-aa67-1328e50cdb2c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.265578 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-inventory" (OuterVolumeSpecName: "inventory") pod "b712434b-6106-44ed-aa67-1328e50cdb2c" (UID: "b712434b-6106-44ed-aa67-1328e50cdb2c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.301496 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.301539 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzp24\" (UniqueName: \"kubernetes.io/projected/b712434b-6106-44ed-aa67-1328e50cdb2c-kube-api-access-fzp24\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.301554 4933 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.301567 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b712434b-6106-44ed-aa67-1328e50cdb2c-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.582701 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5558cd5dc7-s4k68" podUID="4caba072-5d91-4126-909a-b4e7ad6167d6" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.212:8004/healthcheck\": read tcp 10.217.0.2:50480->10.217.0.212:8004: read: connection reset by peer" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.625455 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.625463 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mznps" event={"ID":"b712434b-6106-44ed-aa67-1328e50cdb2c","Type":"ContainerDied","Data":"9db0325c7877bb8068122588e86bcd0f997b56d2355a29b12048491336f7bd82"} Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.625503 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db0325c7877bb8068122588e86bcd0f997b56d2355a29b12048491336f7bd82" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.639301 4933 generic.go:334] "Generic (PLEG): container finished" podID="4caba072-5d91-4126-909a-b4e7ad6167d6" containerID="502fb37e69c4ed4b648add46e46605f39c20b2666f164ad2678b88200437e652" exitCode=0 Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.639340 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5558cd5dc7-s4k68" event={"ID":"4caba072-5d91-4126-909a-b4e7ad6167d6","Type":"ContainerDied","Data":"502fb37e69c4ed4b648add46e46605f39c20b2666f164ad2678b88200437e652"} Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.729575 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq"] Dec 02 16:20:53 crc kubenswrapper[4933]: E1202 16:20:53.730574 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b712434b-6106-44ed-aa67-1328e50cdb2c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.730598 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b712434b-6106-44ed-aa67-1328e50cdb2c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.730922 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="b712434b-6106-44ed-aa67-1328e50cdb2c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.732202 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.737581 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.737612 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.738349 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.738511 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.743105 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq"] Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.809964 4933 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5979c644c5-dckr9" podUID="3aa14443-9372-4f3d-bc34-3463b051b107" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.213:8000/healthcheck\": read tcp 10.217.0.2:48660->10.217.0.213:8000: read: connection reset by peer" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.813027 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvjq\" (UniqueName: \"kubernetes.io/projected/1eeef066-30d5-47f9-90a0-2815244b7ebb-kube-api-access-ntvjq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.813141 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.813238 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.915550 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.915716 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvjq\" (UniqueName: \"kubernetes.io/projected/1eeef066-30d5-47f9-90a0-2815244b7ebb-kube-api-access-ntvjq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.915819 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.921571 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.921576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:53 crc kubenswrapper[4933]: I1202 16:20:53.941578 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvjq\" (UniqueName: \"kubernetes.io/projected/1eeef066-30d5-47f9-90a0-2815244b7ebb-kube-api-access-ntvjq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l7rfq\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.053702 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.267286 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.334759 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data\") pod \"4caba072-5d91-4126-909a-b4e7ad6167d6\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.334815 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data-custom\") pod \"4caba072-5d91-4126-909a-b4e7ad6167d6\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.335250 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwgk\" (UniqueName: \"kubernetes.io/projected/4caba072-5d91-4126-909a-b4e7ad6167d6-kube-api-access-nxwgk\") pod \"4caba072-5d91-4126-909a-b4e7ad6167d6\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.335433 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-public-tls-certs\") pod \"4caba072-5d91-4126-909a-b4e7ad6167d6\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.335600 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-internal-tls-certs\") pod \"4caba072-5d91-4126-909a-b4e7ad6167d6\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.335885 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-combined-ca-bundle\") pod \"4caba072-5d91-4126-909a-b4e7ad6167d6\" (UID: \"4caba072-5d91-4126-909a-b4e7ad6167d6\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.393014 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4caba072-5d91-4126-909a-b4e7ad6167d6" (UID: "4caba072-5d91-4126-909a-b4e7ad6167d6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.442649 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.449561 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4caba072-5d91-4126-909a-b4e7ad6167d6-kube-api-access-nxwgk" (OuterVolumeSpecName: "kube-api-access-nxwgk") pod "4caba072-5d91-4126-909a-b4e7ad6167d6" (UID: "4caba072-5d91-4126-909a-b4e7ad6167d6"). InnerVolumeSpecName "kube-api-access-nxwgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.537107 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4caba072-5d91-4126-909a-b4e7ad6167d6" (UID: "4caba072-5d91-4126-909a-b4e7ad6167d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.555351 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.559965 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwrxb\" (UniqueName: \"kubernetes.io/projected/3aa14443-9372-4f3d-bc34-3463b051b107-kube-api-access-mwrxb\") pod \"3aa14443-9372-4f3d-bc34-3463b051b107\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.560182 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-combined-ca-bundle\") pod \"3aa14443-9372-4f3d-bc34-3463b051b107\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.560367 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data-custom\") pod \"3aa14443-9372-4f3d-bc34-3463b051b107\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.560515 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data\") pod \"3aa14443-9372-4f3d-bc34-3463b051b107\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.560628 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-public-tls-certs\") pod \"3aa14443-9372-4f3d-bc34-3463b051b107\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.560871 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-internal-tls-certs\") pod \"3aa14443-9372-4f3d-bc34-3463b051b107\" (UID: \"3aa14443-9372-4f3d-bc34-3463b051b107\") " Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.561621 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.561703 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwgk\" (UniqueName: \"kubernetes.io/projected/4caba072-5d91-4126-909a-b4e7ad6167d6-kube-api-access-nxwgk\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.570864 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3aa14443-9372-4f3d-bc34-3463b051b107-kube-api-access-mwrxb" (OuterVolumeSpecName: "kube-api-access-mwrxb") pod "3aa14443-9372-4f3d-bc34-3463b051b107" (UID: "3aa14443-9372-4f3d-bc34-3463b051b107"). InnerVolumeSpecName "kube-api-access-mwrxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.576962 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3aa14443-9372-4f3d-bc34-3463b051b107" (UID: "3aa14443-9372-4f3d-bc34-3463b051b107"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.579915 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data" (OuterVolumeSpecName: "config-data") pod "4caba072-5d91-4126-909a-b4e7ad6167d6" (UID: "4caba072-5d91-4126-909a-b4e7ad6167d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.591621 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4caba072-5d91-4126-909a-b4e7ad6167d6" (UID: "4caba072-5d91-4126-909a-b4e7ad6167d6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.592976 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4caba072-5d91-4126-909a-b4e7ad6167d6" (UID: "4caba072-5d91-4126-909a-b4e7ad6167d6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.619463 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aa14443-9372-4f3d-bc34-3463b051b107" (UID: "3aa14443-9372-4f3d-bc34-3463b051b107"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.663632 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.663664 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.663673 4933 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.663683 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwrxb\" (UniqueName: \"kubernetes.io/projected/3aa14443-9372-4f3d-bc34-3463b051b107-kube-api-access-mwrxb\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.663693 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.663701 4933 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4caba072-5d91-4126-909a-b4e7ad6167d6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.683015 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3aa14443-9372-4f3d-bc34-3463b051b107" (UID: "3aa14443-9372-4f3d-bc34-3463b051b107"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.700028 4933 generic.go:334] "Generic (PLEG): container finished" podID="3aa14443-9372-4f3d-bc34-3463b051b107" containerID="0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f" exitCode=0 Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.700141 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5979c644c5-dckr9" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.700375 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5979c644c5-dckr9" event={"ID":"3aa14443-9372-4f3d-bc34-3463b051b107","Type":"ContainerDied","Data":"0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f"} Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.700494 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5979c644c5-dckr9" event={"ID":"3aa14443-9372-4f3d-bc34-3463b051b107","Type":"ContainerDied","Data":"d6bb5d2dc6f3c14c0923dbcea314c35ef7d8b113e5912ae5fcddf415151f2c07"} Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.700579 4933 scope.go:117] "RemoveContainer" containerID="0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.705612 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data" (OuterVolumeSpecName: "config-data") pod "3aa14443-9372-4f3d-bc34-3463b051b107" (UID: "3aa14443-9372-4f3d-bc34-3463b051b107"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.711021 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5558cd5dc7-s4k68" event={"ID":"4caba072-5d91-4126-909a-b4e7ad6167d6","Type":"ContainerDied","Data":"e955d0062e741a6b46397d5c5d7268ec2ca4a0b5773f8e4a110b5fb1b04055f2"} Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.711204 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5558cd5dc7-s4k68" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.752016 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3aa14443-9372-4f3d-bc34-3463b051b107" (UID: "3aa14443-9372-4f3d-bc34-3463b051b107"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.752964 4933 scope.go:117] "RemoveContainer" containerID="0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f" Dec 02 16:20:54 crc kubenswrapper[4933]: E1202 16:20:54.753988 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f\": container with ID starting with 0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f not found: ID does not exist" containerID="0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.754141 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f"} err="failed to get container status \"0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f\": rpc error: code = NotFound desc = could not find container \"0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f\": container with ID starting with 0b71cb6a351224d9b0489185cbd0ef0bfbecce853323c3977354db13eab3915f not found: ID does not exist" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.754239 4933 scope.go:117] "RemoveContainer" containerID="502fb37e69c4ed4b648add46e46605f39c20b2666f164ad2678b88200437e652" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.768895 4933 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.768931 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.768941 4933 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aa14443-9372-4f3d-bc34-3463b051b107-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.793189 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5558cd5dc7-s4k68"] Dec 02 16:20:54 crc kubenswrapper[4933]: I1202 16:20:54.816774 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5558cd5dc7-s4k68"] Dec 02 16:20:55 crc kubenswrapper[4933]: I1202 16:20:55.045964 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5979c644c5-dckr9"] Dec 02 16:20:55 crc kubenswrapper[4933]: I1202 16:20:55.068660 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4caba072-5d91-4126-909a-b4e7ad6167d6" path="/var/lib/kubelet/pods/4caba072-5d91-4126-909a-b4e7ad6167d6/volumes" Dec 02 16:20:55 crc kubenswrapper[4933]: I1202 16:20:55.069902 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5979c644c5-dckr9"] Dec 02 16:20:55 crc kubenswrapper[4933]: I1202 16:20:55.097041 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq"] Dec 02 16:20:55 crc kubenswrapper[4933]: I1202 16:20:55.725617 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" 
event={"ID":"1eeef066-30d5-47f9-90a0-2815244b7ebb","Type":"ContainerStarted","Data":"2eb0a31bcdca192659cf4248fb4e39b3f7531fb0711782683bf6da4672bd67a3"} Dec 02 16:20:57 crc kubenswrapper[4933]: I1202 16:20:57.073507 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:20:57 crc kubenswrapper[4933]: E1202 16:20:57.074299 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:20:57 crc kubenswrapper[4933]: I1202 16:20:57.075401 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa14443-9372-4f3d-bc34-3463b051b107" path="/var/lib/kubelet/pods/3aa14443-9372-4f3d-bc34-3463b051b107/volumes" Dec 02 16:20:57 crc kubenswrapper[4933]: I1202 16:20:57.910984 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-tdcg5"] Dec 02 16:20:57 crc kubenswrapper[4933]: I1202 16:20:57.930600 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-tdcg5"] Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.018812 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-hjbbg"] Dec 02 16:20:58 crc kubenswrapper[4933]: E1202 16:20:58.019378 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa14443-9372-4f3d-bc34-3463b051b107" containerName="heat-cfnapi" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.019399 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa14443-9372-4f3d-bc34-3463b051b107" containerName="heat-cfnapi" Dec 02 16:20:58 crc kubenswrapper[4933]: E1202 16:20:58.019417 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4caba072-5d91-4126-909a-b4e7ad6167d6" containerName="heat-api" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.019425 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4caba072-5d91-4126-909a-b4e7ad6167d6" containerName="heat-api" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.019692 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa14443-9372-4f3d-bc34-3463b051b107" containerName="heat-cfnapi" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.019714 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4caba072-5d91-4126-909a-b4e7ad6167d6" containerName="heat-api" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.020568 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.023695 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.037388 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hjbbg"] Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.064930 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrlc9\" (UniqueName: \"kubernetes.io/projected/747183c3-25ee-4d86-a4d6-57a21a12adda-kube-api-access-mrlc9\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.065012 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-combined-ca-bundle\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.065129 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-scripts\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.065163 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-config-data\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.168349 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrlc9\" (UniqueName: \"kubernetes.io/projected/747183c3-25ee-4d86-a4d6-57a21a12adda-kube-api-access-mrlc9\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.168407 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-combined-ca-bundle\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.168507 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-scripts\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.168532 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-config-data\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.175306 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-scripts\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.175872 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-config-data\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.177438 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-combined-ca-bundle\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.199568 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrlc9\" (UniqueName: \"kubernetes.io/projected/747183c3-25ee-4d86-a4d6-57a21a12adda-kube-api-access-mrlc9\") pod \"aodh-db-sync-hjbbg\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.351240 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:20:58 crc kubenswrapper[4933]: I1202 16:20:58.884374 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hjbbg"] Dec 02 16:20:58 crc kubenswrapper[4933]: W1202 16:20:58.890224 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod747183c3_25ee_4d86_a4d6_57a21a12adda.slice/crio-cdd3e3267fc9315268ddbca476190fb48a589fc20fa86a4f4b74851227bbfa7d WatchSource:0}: Error finding container cdd3e3267fc9315268ddbca476190fb48a589fc20fa86a4f4b74851227bbfa7d: Status 404 returned error can't find the container with id cdd3e3267fc9315268ddbca476190fb48a589fc20fa86a4f4b74851227bbfa7d Dec 02 16:20:59 crc kubenswrapper[4933]: I1202 16:20:59.092664 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020884d1-c656-48c9-966b-5f7da8bf6af6" path="/var/lib/kubelet/pods/020884d1-c656-48c9-966b-5f7da8bf6af6/volumes" Dec 02 16:20:59 crc kubenswrapper[4933]: I1202 16:20:59.775756 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hjbbg" event={"ID":"747183c3-25ee-4d86-a4d6-57a21a12adda","Type":"ContainerStarted","Data":"cdd3e3267fc9315268ddbca476190fb48a589fc20fa86a4f4b74851227bbfa7d"} Dec 02 16:20:59 crc kubenswrapper[4933]: I1202 16:20:59.777617 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" event={"ID":"1eeef066-30d5-47f9-90a0-2815244b7ebb","Type":"ContainerStarted","Data":"36cc561cc1936b1ec4bd32a7f12e5c06dba9f466a05f9a9a65c2e9f8d197c78d"} Dec 02 16:21:02 crc kubenswrapper[4933]: E1202 16:21:02.448705 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 16:21:02 crc kubenswrapper[4933]: E1202 16:21:02.450092 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 16:21:02 crc kubenswrapper[4933]: E1202 16:21:02.451462 4933 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 16:21:02 crc kubenswrapper[4933]: E1202 16:21:02.451505 4933 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" podUID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" containerName="heat-engine" Dec 02 16:21:02 crc kubenswrapper[4933]: I1202 16:21:02.819183 4933 generic.go:334] "Generic (PLEG): container finished" podID="1eeef066-30d5-47f9-90a0-2815244b7ebb" containerID="36cc561cc1936b1ec4bd32a7f12e5c06dba9f466a05f9a9a65c2e9f8d197c78d" exitCode=0 Dec 02 16:21:02 crc kubenswrapper[4933]: I1202 16:21:02.819239 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" event={"ID":"1eeef066-30d5-47f9-90a0-2815244b7ebb","Type":"ContainerDied","Data":"36cc561cc1936b1ec4bd32a7f12e5c06dba9f466a05f9a9a65c2e9f8d197c78d"} Dec 02 16:21:02 crc kubenswrapper[4933]: I1202 16:21:02.836460 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 16:21:02 crc kubenswrapper[4933]: I1202 16:21:02.847975 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 16:21:04 crc kubenswrapper[4933]: I1202 16:21:04.869077 4933 generic.go:334] "Generic (PLEG): container finished" podID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" containerID="233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8" exitCode=0 Dec 02 16:21:04 crc kubenswrapper[4933]: I1202 16:21:04.869641 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" event={"ID":"59434c0f-e21f-45a2-909b-fa1143a5b5ae","Type":"ContainerDied","Data":"233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8"} Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.433251 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.504041 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-ssh-key\") pod \"1eeef066-30d5-47f9-90a0-2815244b7ebb\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.504598 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-inventory\") pod \"1eeef066-30d5-47f9-90a0-2815244b7ebb\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.504731 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvjq\" (UniqueName: \"kubernetes.io/projected/1eeef066-30d5-47f9-90a0-2815244b7ebb-kube-api-access-ntvjq\") pod \"1eeef066-30d5-47f9-90a0-2815244b7ebb\" (UID: \"1eeef066-30d5-47f9-90a0-2815244b7ebb\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.507790 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eeef066-30d5-47f9-90a0-2815244b7ebb-kube-api-access-ntvjq" (OuterVolumeSpecName: "kube-api-access-ntvjq") pod "1eeef066-30d5-47f9-90a0-2815244b7ebb" (UID: "1eeef066-30d5-47f9-90a0-2815244b7ebb"). InnerVolumeSpecName "kube-api-access-ntvjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.515235 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.553137 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-inventory" (OuterVolumeSpecName: "inventory") pod "1eeef066-30d5-47f9-90a0-2815244b7ebb" (UID: "1eeef066-30d5-47f9-90a0-2815244b7ebb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.557759 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1eeef066-30d5-47f9-90a0-2815244b7ebb" (UID: "1eeef066-30d5-47f9-90a0-2815244b7ebb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.606625 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfxk8\" (UniqueName: \"kubernetes.io/projected/59434c0f-e21f-45a2-909b-fa1143a5b5ae-kube-api-access-lfxk8\") pod \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.606683 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data\") pod \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.606748 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data-custom\") pod \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.607059 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-combined-ca-bundle\") pod \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\" (UID: \"59434c0f-e21f-45a2-909b-fa1143a5b5ae\") " Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.607978 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.608006 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvjq\" (UniqueName: \"kubernetes.io/projected/1eeef066-30d5-47f9-90a0-2815244b7ebb-kube-api-access-ntvjq\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.608021 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1eeef066-30d5-47f9-90a0-2815244b7ebb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.610079 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59434c0f-e21f-45a2-909b-fa1143a5b5ae" (UID: "59434c0f-e21f-45a2-909b-fa1143a5b5ae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.610465 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59434c0f-e21f-45a2-909b-fa1143a5b5ae-kube-api-access-lfxk8" (OuterVolumeSpecName: "kube-api-access-lfxk8") pod "59434c0f-e21f-45a2-909b-fa1143a5b5ae" (UID: "59434c0f-e21f-45a2-909b-fa1143a5b5ae"). InnerVolumeSpecName "kube-api-access-lfxk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.642572 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59434c0f-e21f-45a2-909b-fa1143a5b5ae" (UID: "59434c0f-e21f-45a2-909b-fa1143a5b5ae"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.673508 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data" (OuterVolumeSpecName: "config-data") pod "59434c0f-e21f-45a2-909b-fa1143a5b5ae" (UID: "59434c0f-e21f-45a2-909b-fa1143a5b5ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.710289 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfxk8\" (UniqueName: \"kubernetes.io/projected/59434c0f-e21f-45a2-909b-fa1143a5b5ae-kube-api-access-lfxk8\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.710327 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.710341 4933 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.710357 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59434c0f-e21f-45a2-909b-fa1143a5b5ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.886242 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" event={"ID":"59434c0f-e21f-45a2-909b-fa1143a5b5ae","Type":"ContainerDied","Data":"f6a899d43c400f1edd0baa4bbb9b5afe2f83946f9b7961f7ecf625e2b3793b42"} Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.886322 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64b6fd7f9d-b5l2s" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.886394 4933 scope.go:117] "RemoveContainer" containerID="233e8e7fffa44b80d7319fbe8af766215914d33ff5dc13cd6f7267d68b6d25b8" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.890010 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hjbbg" event={"ID":"747183c3-25ee-4d86-a4d6-57a21a12adda","Type":"ContainerStarted","Data":"d150a2ea84d17323ba5f1d3abe9bf753ac35e0c88d6ac8bbb05b056fad34319f"} Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.892817 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" event={"ID":"1eeef066-30d5-47f9-90a0-2815244b7ebb","Type":"ContainerDied","Data":"2eb0a31bcdca192659cf4248fb4e39b3f7531fb0711782683bf6da4672bd67a3"} Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.892894 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb0a31bcdca192659cf4248fb4e39b3f7531fb0711782683bf6da4672bd67a3" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.892897 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l7rfq" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.919769 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-hjbbg" podStartSLOduration=2.594466003 podStartE2EDuration="8.919048909s" podCreationTimestamp="2025-12-02 16:20:57 +0000 UTC" firstStartedPulling="2025-12-02 16:20:58.898438648 +0000 UTC m=+1722.149665351" lastFinishedPulling="2025-12-02 16:21:05.223021554 +0000 UTC m=+1728.474248257" observedRunningTime="2025-12-02 16:21:05.912133222 +0000 UTC m=+1729.163359925" watchObservedRunningTime="2025-12-02 16:21:05.919048909 +0000 UTC m=+1729.170275622" Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.954742 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-64b6fd7f9d-b5l2s"] Dec 02 16:21:05 crc kubenswrapper[4933]: I1202 16:21:05.967331 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-64b6fd7f9d-b5l2s"] Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.514260 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp"] Dec 02 16:21:06 crc kubenswrapper[4933]: E1202 16:21:06.515209 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" containerName="heat-engine" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.515232 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" containerName="heat-engine" Dec 02 16:21:06 crc kubenswrapper[4933]: E1202 16:21:06.515272 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eeef066-30d5-47f9-90a0-2815244b7ebb" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.515282 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eeef066-30d5-47f9-90a0-2815244b7ebb" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.515578 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eeef066-30d5-47f9-90a0-2815244b7ebb" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.515619 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" containerName="heat-engine" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.516599 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.519486 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.519723 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.520339 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.521682 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.526885 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp"] Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.631064 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xms7w\" (UniqueName: \"kubernetes.io/projected/2060aa16-0f55-457f-98c1-058372e78f0f-kube-api-access-xms7w\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.631150 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.631201 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.631232 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.733266 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xms7w\" (UniqueName: \"kubernetes.io/projected/2060aa16-0f55-457f-98c1-058372e78f0f-kube-api-access-xms7w\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.733368 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.733424 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.733456 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.741572 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.741956 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.742428 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.752881 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xms7w\" (UniqueName: \"kubernetes.io/projected/2060aa16-0f55-457f-98c1-058372e78f0f-kube-api-access-xms7w\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:06 crc kubenswrapper[4933]: I1202 16:21:06.869279 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:21:07 crc kubenswrapper[4933]: I1202 16:21:07.082301 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59434c0f-e21f-45a2-909b-fa1143a5b5ae" path="/var/lib/kubelet/pods/59434c0f-e21f-45a2-909b-fa1143a5b5ae/volumes" Dec 02 16:21:07 crc kubenswrapper[4933]: I1202 16:21:07.429746 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp"] Dec 02 16:21:07 crc kubenswrapper[4933]: I1202 16:21:07.919080 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" event={"ID":"2060aa16-0f55-457f-98c1-058372e78f0f","Type":"ContainerStarted","Data":"843168771e4d5c08737467f304ddf760122c1c29a18f73264146ab29e93242b3"} Dec 02 16:21:08 crc kubenswrapper[4933]: I1202 16:21:08.931925 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" event={"ID":"2060aa16-0f55-457f-98c1-058372e78f0f","Type":"ContainerStarted","Data":"52147fa4a8f7644dac2993ebb77b6476a67f7ed664bbd3f7fdcf3949422fdead"} Dec 02 16:21:08 crc kubenswrapper[4933]: I1202 16:21:08.933384 4933 generic.go:334] "Generic (PLEG): container finished" podID="747183c3-25ee-4d86-a4d6-57a21a12adda" containerID="d150a2ea84d17323ba5f1d3abe9bf753ac35e0c88d6ac8bbb05b056fad34319f" exitCode=0 Dec 02 16:21:08 crc kubenswrapper[4933]: I1202 16:21:08.933421 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hjbbg" event={"ID":"747183c3-25ee-4d86-a4d6-57a21a12adda","Type":"ContainerDied","Data":"d150a2ea84d17323ba5f1d3abe9bf753ac35e0c88d6ac8bbb05b056fad34319f"} Dec 02 16:21:08 crc kubenswrapper[4933]: I1202 16:21:08.947844 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" podStartSLOduration=1.916648442 podStartE2EDuration="2.947809626s" podCreationTimestamp="2025-12-02 16:21:06 +0000 UTC" firstStartedPulling="2025-12-02 16:21:07.427886224 +0000 UTC m=+1730.679112927" lastFinishedPulling="2025-12-02 16:21:08.459047408 +0000 UTC m=+1731.710274111" observedRunningTime="2025-12-02 16:21:08.946693016 +0000 UTC m=+1732.197919719" watchObservedRunningTime="2025-12-02 16:21:08.947809626 +0000 UTC m=+1732.199036329" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.452355 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.546511 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrlc9\" (UniqueName: \"kubernetes.io/projected/747183c3-25ee-4d86-a4d6-57a21a12adda-kube-api-access-mrlc9\") pod \"747183c3-25ee-4d86-a4d6-57a21a12adda\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.546579 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-scripts\") pod \"747183c3-25ee-4d86-a4d6-57a21a12adda\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.546599 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-combined-ca-bundle\") pod \"747183c3-25ee-4d86-a4d6-57a21a12adda\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.546803 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-config-data\") pod \"747183c3-25ee-4d86-a4d6-57a21a12adda\" (UID: \"747183c3-25ee-4d86-a4d6-57a21a12adda\") " Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.551969 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-scripts" (OuterVolumeSpecName: "scripts") pod "747183c3-25ee-4d86-a4d6-57a21a12adda" (UID: "747183c3-25ee-4d86-a4d6-57a21a12adda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.552448 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747183c3-25ee-4d86-a4d6-57a21a12adda-kube-api-access-mrlc9" (OuterVolumeSpecName: "kube-api-access-mrlc9") pod "747183c3-25ee-4d86-a4d6-57a21a12adda" (UID: "747183c3-25ee-4d86-a4d6-57a21a12adda"). InnerVolumeSpecName "kube-api-access-mrlc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.577620 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-config-data" (OuterVolumeSpecName: "config-data") pod "747183c3-25ee-4d86-a4d6-57a21a12adda" (UID: "747183c3-25ee-4d86-a4d6-57a21a12adda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.581387 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "747183c3-25ee-4d86-a4d6-57a21a12adda" (UID: "747183c3-25ee-4d86-a4d6-57a21a12adda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.650028 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrlc9\" (UniqueName: \"kubernetes.io/projected/747183c3-25ee-4d86-a4d6-57a21a12adda-kube-api-access-mrlc9\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.650065 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.650075 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.650085 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747183c3-25ee-4d86-a4d6-57a21a12adda-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.959395 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hjbbg" event={"ID":"747183c3-25ee-4d86-a4d6-57a21a12adda","Type":"ContainerDied","Data":"cdd3e3267fc9315268ddbca476190fb48a589fc20fa86a4f4b74851227bbfa7d"} Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.959669 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdd3e3267fc9315268ddbca476190fb48a589fc20fa86a4f4b74851227bbfa7d" Dec 02 16:21:10 crc kubenswrapper[4933]: I1202 16:21:10.959470 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hjbbg" Dec 02 16:21:11 crc kubenswrapper[4933]: I1202 16:21:11.057041 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:21:11 crc kubenswrapper[4933]: E1202 16:21:11.071253 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.299989 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.300593 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-api" containerID="cri-o://38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2" gracePeriod=30 Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.300606 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-listener" containerID="cri-o://d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6" gracePeriod=30 Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.300712 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-notifier" 
containerID="cri-o://e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b" gracePeriod=30 Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.300764 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-evaluator" containerID="cri-o://6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b" gracePeriod=30 Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.993909 4933 generic.go:334] "Generic (PLEG): container finished" podID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerID="6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b" exitCode=0 Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.994238 4933 generic.go:334] "Generic (PLEG): container finished" podID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerID="38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2" exitCode=0 Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.993994 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerDied","Data":"6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b"} Dec 02 16:21:13 crc kubenswrapper[4933]: I1202 16:21:13.994280 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerDied","Data":"38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2"} Dec 02 16:21:15 crc kubenswrapper[4933]: I1202 16:21:15.010681 4933 generic.go:334] "Generic (PLEG): container finished" podID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerID="e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b" exitCode=0 Dec 02 16:21:15 crc kubenswrapper[4933]: I1202 16:21:15.010713 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerDied","Data":"e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b"} Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.671835 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.818470 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-config-data\") pod \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.818581 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-public-tls-certs\") pod \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.818720 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvtf2\" (UniqueName: \"kubernetes.io/projected/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-kube-api-access-vvtf2\") pod \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.818897 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-scripts\") pod \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.818971 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-combined-ca-bundle\") pod \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.819043 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-internal-tls-certs\") pod \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\" (UID: \"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37\") " Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.824493 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-kube-api-access-vvtf2" (OuterVolumeSpecName: "kube-api-access-vvtf2") pod "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" (UID: "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37"). InnerVolumeSpecName "kube-api-access-vvtf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.841059 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-scripts" (OuterVolumeSpecName: "scripts") pod "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" (UID: "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.890499 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" (UID: "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.910332 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" (UID: "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.923026 4933 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.923320 4933 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.923440 4933 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.923514 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvtf2\" (UniqueName: \"kubernetes.io/projected/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-kube-api-access-vvtf2\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.948039 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" (UID: "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:16 crc kubenswrapper[4933]: I1202 16:21:16.987036 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-config-data" (OuterVolumeSpecName: "config-data") pod "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" (UID: "270fd24f-c6e3-47d9-b20f-7a7c9aa15d37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.041698 4933 generic.go:334] "Generic (PLEG): container finished" podID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerID="d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6" exitCode=0 Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.041744 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerDied","Data":"d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6"} Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.041775 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"270fd24f-c6e3-47d9-b20f-7a7c9aa15d37","Type":"ContainerDied","Data":"1bab90aa13745309c7af191669811adb8e31c099193d3ceb0c1f9464d49ebf01"} Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.041792 4933 scope.go:117] "RemoveContainer" containerID="d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.041996 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.075524 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.076911 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.122871 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.137637 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.157750 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.158354 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-api" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158372 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-api" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.158386 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-listener" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158392 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-listener" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.158427 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-evaluator" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158433 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-evaluator" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.158448 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" 
containerName="aodh-notifier" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158454 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-notifier" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.158471 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747183c3-25ee-4d86-a4d6-57a21a12adda" containerName="aodh-db-sync" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158476 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="747183c3-25ee-4d86-a4d6-57a21a12adda" containerName="aodh-db-sync" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158702 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-notifier" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158722 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="747183c3-25ee-4d86-a4d6-57a21a12adda" containerName="aodh-db-sync" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158734 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-api" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158749 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-evaluator" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.158763 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" containerName="aodh-listener" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.161222 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.161315 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.168476 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.168679 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.168801 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.168926 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-g75sl" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.169065 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.199344 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.199390 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-config-data\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.199437 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fk9b\" (UniqueName: \"kubernetes.io/projected/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-kube-api-access-5fk9b\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.199489 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-public-tls-certs\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.199538 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-internal-tls-certs\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.199582 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-scripts\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.276667 4933 scope.go:117] "RemoveContainer" containerID="e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.302326 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-combined-ca-bundle\") pod \"aodh-0\" (UID: 
\"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.302369 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-config-data\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.302403 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fk9b\" (UniqueName: \"kubernetes.io/projected/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-kube-api-access-5fk9b\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.302450 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-public-tls-certs\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.302500 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-internal-tls-certs\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.302557 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-scripts\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.306698 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.306721 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-public-tls-certs\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.308396 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-scripts\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.309339 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-config-data\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.311734 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-internal-tls-certs\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.324600 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5fk9b\" (UniqueName: \"kubernetes.io/projected/9e5f2071-d2a6-49cb-9c10-34a6f81bc75b-kube-api-access-5fk9b\") pod \"aodh-0\" (UID: \"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b\") " pod="openstack/aodh-0" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.435910 4933 scope.go:117] "RemoveContainer" containerID="6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.455191 4933 scope.go:117] "RemoveContainer" containerID="38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.475185 4933 scope.go:117] "RemoveContainer" containerID="d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.475530 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6\": container with ID starting with d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6 not found: ID does not exist" containerID="d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.475578 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6"} err="failed to get container status \"d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6\": rpc error: code = NotFound desc = could not find container \"d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6\": container with ID starting with d03f41c6f6db8f3d1d3d64cc036e4af2271929cb2431a090ee946954af3e20a6 not found: ID does not exist" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.475619 4933 scope.go:117] "RemoveContainer" containerID="e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.475991 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b\": container with ID starting with e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b not found: ID does not exist" containerID="e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.476035 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b"} err="failed to get container status \"e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b\": rpc error: code = NotFound desc = could not find container \"e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b\": container with ID starting with e6558144312a5355d0032730e5646c9623d2b0930eb2e7221c517509602c624b not found: ID does not exist" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.476066 4933 scope.go:117] "RemoveContainer" containerID="6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.476346 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b\": container with ID starting 
with 6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b not found: ID does not exist" containerID="6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.476377 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b"} err="failed to get container status \"6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b\": rpc error: code = NotFound desc = could not find container \"6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b\": container with ID starting with 6ba4c0a8df56cf0bf86039a2a8fcbf3d98bc92ae1f70fc1e6fc99297dabf227b not found: ID does not exist" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.476398 4933 scope.go:117] "RemoveContainer" containerID="38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2" Dec 02 16:21:17 crc kubenswrapper[4933]: E1202 16:21:17.476599 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2\": container with ID starting with 38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2 not found: ID does not exist" containerID="38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.476638 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2"} err="failed to get container status \"38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2\": rpc error: code = NotFound desc = could not find container \"38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2\": container with ID starting with 38de01f7ad7491c7c2e2d65b96fa2799b18c3f7c16abee72ce22a35810a0dac2 not found: ID does not exist" Dec 02 16:21:17 crc kubenswrapper[4933]: I1202 16:21:17.500478 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 16:21:18 crc kubenswrapper[4933]: I1202 16:21:18.006969 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 16:21:18 crc kubenswrapper[4933]: I1202 16:21:18.057206 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b","Type":"ContainerStarted","Data":"4be0976a8c4cac4ca856cd09837f8c6f661e5971d90762ca494bae9cc68b331d"} Dec 02 16:21:19 crc kubenswrapper[4933]: I1202 16:21:19.072356 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270fd24f-c6e3-47d9-b20f-7a7c9aa15d37" path="/var/lib/kubelet/pods/270fd24f-c6e3-47d9-b20f-7a7c9aa15d37/volumes" Dec 02 16:21:19 crc kubenswrapper[4933]: I1202 16:21:19.087086 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b","Type":"ContainerStarted","Data":"2d595f3ad9ef0c35baf18f18c684dd46b80a18bf492e448500035bd8474334f0"} Dec 02 16:21:20 crc kubenswrapper[4933]: I1202 16:21:20.089208 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b","Type":"ContainerStarted","Data":"01d6eb610476e24c752d707a5cf44cfbb3fbd8241bd35529c11d8e7604087bd2"} Dec 02 16:21:21 crc kubenswrapper[4933]: I1202 16:21:21.112505 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b","Type":"ContainerStarted","Data":"0eb6cc5b3fd526c174f746dae891902739e97008d3b29efe6b79f39940044bfd"} Dec 02 16:21:22 crc kubenswrapper[4933]: I1202 16:21:22.134024 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9e5f2071-d2a6-49cb-9c10-34a6f81bc75b","Type":"ContainerStarted","Data":"5a5c538a6a2dcf252f7e1ab057c480ba400aa13aff5472a90678e873fdb9c4a0"} Dec 02 16:21:22 crc kubenswrapper[4933]: I1202 16:21:22.163214 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.364682357 podStartE2EDuration="5.16319041s" podCreationTimestamp="2025-12-02 16:21:17 +0000 UTC" firstStartedPulling="2025-12-02 16:21:17.974901541 +0000 UTC m=+1741.226128244" lastFinishedPulling="2025-12-02 16:21:21.773409594 +0000 UTC m=+1745.024636297" observedRunningTime="2025-12-02 16:21:22.154141105 +0000 UTC m=+1745.405367808" watchObservedRunningTime="2025-12-02 16:21:22.16319041 +0000 UTC m=+1745.414417113" Dec 02 16:21:24 crc kubenswrapper[4933]: I1202 16:21:24.054872 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:21:24 crc kubenswrapper[4933]: E1202 16:21:24.056083 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:21:38 crc kubenswrapper[4933]: I1202 16:21:38.054075 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:21:38 crc kubenswrapper[4933]: E1202 16:21:38.055067 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:21:43 crc kubenswrapper[4933]: I1202 16:21:43.196782 4933 scope.go:117] "RemoveContainer" containerID="dae24602cbed033884446a88e1c910eb45268850208f26904539043d4084c582" Dec 02 16:21:43 crc kubenswrapper[4933]: I1202 16:21:43.247425 4933 scope.go:117] "RemoveContainer" containerID="6fe16daec987538433d6adf8c740cdafa347db9e5e864e13c9d2b64a817e3ca5" Dec 02 16:21:43 crc kubenswrapper[4933]: I1202 16:21:43.307419 4933 scope.go:117] "RemoveContainer" containerID="4a0971639cd14f1cfd98e67c6cee918a1300afbfb85fa40767af6f4febb88f05" Dec 02 16:21:51 crc kubenswrapper[4933]: I1202 16:21:51.053186 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:21:51 crc kubenswrapper[4933]: E1202 16:21:51.054189 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:22:06 crc kubenswrapper[4933]: I1202 16:22:06.053696 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:22:06 crc kubenswrapper[4933]: E1202 16:22:06.054557 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.085122 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6v9b"] Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.088533 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.103597 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6v9b"] Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.192347 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfb3af35-57b8-4936-9baf-7cc31fe9682b-utilities\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.192396 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfb3af35-57b8-4936-9baf-7cc31fe9682b-catalog-content\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.192443 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zf68\" (UniqueName: \"kubernetes.io/projected/dfb3af35-57b8-4936-9baf-7cc31fe9682b-kube-api-access-4zf68\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.295670 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfb3af35-57b8-4936-9baf-7cc31fe9682b-utilities\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.296046 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfb3af35-57b8-4936-9baf-7cc31fe9682b-catalog-content\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.296268 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfb3af35-57b8-4936-9baf-7cc31fe9682b-utilities\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.296275 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zf68\" (UniqueName: \"kubernetes.io/projected/dfb3af35-57b8-4936-9baf-7cc31fe9682b-kube-api-access-4zf68\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.296481 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfb3af35-57b8-4936-9baf-7cc31fe9682b-catalog-content\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.316767 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4zf68\" (UniqueName: \"kubernetes.io/projected/dfb3af35-57b8-4936-9baf-7cc31fe9682b-kube-api-access-4zf68\") pod \"community-operators-h6v9b\" (UID: \"dfb3af35-57b8-4936-9baf-7cc31fe9682b\") " pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.413161 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:18 crc kubenswrapper[4933]: I1202 16:22:18.974976 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6v9b"] Dec 02 16:22:19 crc kubenswrapper[4933]: I1202 16:22:19.820302 4933 generic.go:334] "Generic (PLEG): container finished" podID="dfb3af35-57b8-4936-9baf-7cc31fe9682b" containerID="b02b0acd60fc8e15fb78ab2024d6d09823a9c4fd7d27844a4d51159e4b2dc53a" exitCode=0 Dec 02 16:22:19 crc kubenswrapper[4933]: I1202 16:22:19.820467 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6v9b" event={"ID":"dfb3af35-57b8-4936-9baf-7cc31fe9682b","Type":"ContainerDied","Data":"b02b0acd60fc8e15fb78ab2024d6d09823a9c4fd7d27844a4d51159e4b2dc53a"} Dec 02 16:22:19 crc kubenswrapper[4933]: I1202 16:22:19.820620 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6v9b" event={"ID":"dfb3af35-57b8-4936-9baf-7cc31fe9682b","Type":"ContainerStarted","Data":"6f9fca25aa5305595c1ab5fcd132b84bcd1028a5d43a122558f3d34ae56bfda4"} Dec 02 16:22:21 crc kubenswrapper[4933]: I1202 16:22:21.053629 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:22:21 crc kubenswrapper[4933]: E1202 16:22:21.054559 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:22:24 crc kubenswrapper[4933]: I1202 16:22:24.880310 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6v9b" event={"ID":"dfb3af35-57b8-4936-9baf-7cc31fe9682b","Type":"ContainerStarted","Data":"5e98dbd8c046f54e1e9755510331a83a1b0e2fbed4f35c0c70314f4d2fe5a9fe"} Dec 02 16:22:25 crc kubenswrapper[4933]: I1202 16:22:25.914616 4933 generic.go:334] "Generic (PLEG): container finished" podID="dfb3af35-57b8-4936-9baf-7cc31fe9682b" containerID="5e98dbd8c046f54e1e9755510331a83a1b0e2fbed4f35c0c70314f4d2fe5a9fe" exitCode=0 Dec 02 16:22:25 crc kubenswrapper[4933]: I1202 16:22:25.914661 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6v9b" event={"ID":"dfb3af35-57b8-4936-9baf-7cc31fe9682b","Type":"ContainerDied","Data":"5e98dbd8c046f54e1e9755510331a83a1b0e2fbed4f35c0c70314f4d2fe5a9fe"} Dec 02 16:22:26 crc kubenswrapper[4933]: I1202 16:22:26.928526 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6v9b" event={"ID":"dfb3af35-57b8-4936-9baf-7cc31fe9682b","Type":"ContainerStarted","Data":"f6cd74c9828cd1548189799df7407c61cb98a82572fbb8c6795e253f10c9598d"} Dec 02 16:22:26 crc kubenswrapper[4933]: I1202 16:22:26.957296 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6v9b" podStartSLOduration=2.153195621 podStartE2EDuration="8.957250116s" podCreationTimestamp="2025-12-02 16:22:18 +0000 UTC" firstStartedPulling="2025-12-02 16:22:19.822271746 +0000 UTC m=+1803.073498449" lastFinishedPulling="2025-12-02 16:22:26.626326251 +0000 UTC m=+1809.877552944" observedRunningTime="2025-12-02 16:22:26.94516644 +0000 UTC m=+1810.196393133" watchObservedRunningTime="2025-12-02 16:22:26.957250116 +0000 UTC m=+1810.208476819" Dec 02 16:22:28 crc kubenswrapper[4933]: I1202 16:22:28.413279 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:28 crc kubenswrapper[4933]: I1202 16:22:28.413634 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:29 crc kubenswrapper[4933]: I1202 16:22:29.466644 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-h6v9b" podUID="dfb3af35-57b8-4936-9baf-7cc31fe9682b" containerName="registry-server" probeResult="failure" output=< Dec 02 16:22:29 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 16:22:29 crc kubenswrapper[4933]: > Dec 02 16:22:32 crc kubenswrapper[4933]: I1202 16:22:32.053574 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:22:32 crc kubenswrapper[4933]: E1202 16:22:32.054499 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:22:38 crc kubenswrapper[4933]: I1202 16:22:38.487725 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:38 crc kubenswrapper[4933]: I1202 16:22:38.551385 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6v9b" Dec 02 16:22:38 crc kubenswrapper[4933]: I1202 16:22:38.621409 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6v9b"] Dec 02 16:22:38 crc kubenswrapper[4933]: I1202 16:22:38.735864 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m288s"] Dec 02 16:22:38 crc kubenswrapper[4933]: I1202 16:22:38.736103 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m288s" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="registry-server" containerID="cri-o://3fe6ada5e8e517625cd45722d3e2289ed1ad3e59a2454e74217b73668d8565c8" gracePeriod=2 Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.132952 4933 generic.go:334] "Generic (PLEG): container finished" podID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerID="3fe6ada5e8e517625cd45722d3e2289ed1ad3e59a2454e74217b73668d8565c8" exitCode=0
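The "Observed pod startup duration" entry above is the kubelet's startup-latency SLO tracker, and its fields are self-consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (16:22:26.957250116 - 16:22:18 = 8.957250116s), and podStartSLOduration is that figure with the image-pull window removed, where the pull window is lastFinishedPulling minus firstStartedPulling taken from the monotonic m=+ offsets. A minimal arithmetic check in Go, with the numbers copied from the entry above (the variable names are ours, not kubelet's):

package main

import "fmt"

func main() {
	// watchObservedRunningTime - podCreationTimestamp, as seconds within 16:22.
	e2e := 26.957250116 - 18.0
	// lastFinishedPulling - firstStartedPulling, from the monotonic m=+ offsets.
	pull := 1809.877552944 - 1803.073498449
	// The SLO duration excludes time spent pulling images.
	slo := e2e - pull
	fmt.Printf("podStartE2EDuration=%.9fs podStartSLOduration=%.9f\n", e2e, slo)
	// Prints 8.957250116s and 2.153195621, matching the logged values.
}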
pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerDied","Data":"3fe6ada5e8e517625cd45722d3e2289ed1ad3e59a2454e74217b73668d8565c8"} Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.494281 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m288s" Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.648248 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfvts\" (UniqueName: \"kubernetes.io/projected/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-kube-api-access-kfvts\") pod \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.648439 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-utilities\") pod \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.648491 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-catalog-content\") pod \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\" (UID: \"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0\") " Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.649607 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-utilities" (OuterVolumeSpecName: "utilities") pod "cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" (UID: "cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.655686 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-kube-api-access-kfvts" (OuterVolumeSpecName: "kube-api-access-kfvts") pod "cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" (UID: "cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0"). InnerVolumeSpecName "kube-api-access-kfvts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.697565 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" (UID: "cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.752574 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfvts\" (UniqueName: \"kubernetes.io/projected/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-kube-api-access-kfvts\") on node \"crc\" DevicePath \"\"" Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.752609 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:22:39 crc kubenswrapper[4933]: I1202 16:22:39.752619 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.157474 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m288s" Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.157483 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m288s" event={"ID":"cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0","Type":"ContainerDied","Data":"847cf8c6a96570454add401f7d0dd57ca9f8b35c57070876957a99110e00d54b"} Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.157978 4933 scope.go:117] "RemoveContainer" containerID="3fe6ada5e8e517625cd45722d3e2289ed1ad3e59a2454e74217b73668d8565c8" Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.198625 4933 scope.go:117] "RemoveContainer" containerID="d94de596053ab6113630c603d0368a9e7fee83ffcfc3bf8871cdf07b7fc62b44" Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.203875 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m288s"] Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.215195 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m288s"] Dec 02 16:22:40 crc kubenswrapper[4933]: I1202 16:22:40.278764 4933 scope.go:117] "RemoveContainer" containerID="64c15a8f1950117bf0ef473c7ac3d574ccb1eea8b971594c28e660cada5dbe5a" Dec 02 16:22:41 crc kubenswrapper[4933]: I1202 16:22:41.067082 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" path="/var/lib/kubelet/pods/cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0/volumes" Dec 02 16:22:43 crc kubenswrapper[4933]: I1202 16:22:43.460470 4933 scope.go:117] "RemoveContainer" containerID="cd2e80308c9f1b00eaa6c2edd6cc51563d704053c166b8a9f5f3440137e64f3c" Dec 02 16:22:46 crc kubenswrapper[4933]: I1202 16:22:46.054153 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:22:46 crc kubenswrapper[4933]: E1202 16:22:46.055301 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:23:00 crc kubenswrapper[4933]: I1202 16:23:00.053998 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 
02 16:23:00 crc kubenswrapper[4933]: E1202 16:23:00.055980 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:23:15 crc kubenswrapper[4933]: I1202 16:23:15.054159 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:23:15 crc kubenswrapper[4933]: E1202 16:23:15.054865 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:23:26 crc kubenswrapper[4933]: I1202 16:23:26.054259 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:23:26 crc kubenswrapper[4933]: E1202 16:23:26.055062 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:23:37 crc kubenswrapper[4933]: I1202 16:23:37.065247 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:23:37 crc kubenswrapper[4933]: E1202 16:23:37.066762 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:23:43 crc kubenswrapper[4933]: I1202 16:23:43.559460 4933 scope.go:117] "RemoveContainer" containerID="43aeb7bd707fcb0da67202e8c8af991d815d581f71fecb2bcf97157a60608b96" Dec 02 16:23:52 crc kubenswrapper[4933]: I1202 16:23:52.052775 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:23:53 crc kubenswrapper[4933]: I1202 16:23:53.119174 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"620bdd0c4263af7415f957bb835c5b4382baf2a25794824589647a73b3d410f0"}
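The repeated "back-off 5m0s restarting failed container" errors above are CrashLoopBackOff at its ceiling: the kubelet delays each restart of a crashing container, doubling the delay per crash up to a cap, so the intervening RemoveContainer/"Error syncing pod" pairs are sync-loop retries bouncing off the back-off window rather than actual restarts. Only at 16:23:52 does the window expire and the machine-config-daemon container get started again. A sketch of that schedule, assuming the upstream kubelet defaults of a 10s initial delay doubling up to a 5m cap (the constant names are illustrative):

package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 10 * time.Second // assumed kubelet base back-off
		maxDelay     = 5 * time.Minute  // the "back-off 5m0s" ceiling seen in the log
	)
	delay := initialDelay
	for crash := 1; crash <= 8; crash++ {
		fmt.Printf("after crash %d: wait %v before restarting\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // from the sixth crash on, every retry waits the full 5m0s
		}
	}
}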
Dec 02 16:24:04 crc kubenswrapper[4933]: I1202 16:24:04.072643 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kqjfb"] Dec 02 16:24:04 crc kubenswrapper[4933]: I1202 16:24:04.086688 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-tmvnh"] Dec 02 16:24:04 crc kubenswrapper[4933]: I1202 16:24:04.098977 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-92e3-account-create-update-mql6h"] Dec 02 16:24:04 crc kubenswrapper[4933]: I1202 16:24:04.111545 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-tmvnh"] Dec 02 16:24:04 crc kubenswrapper[4933]: I1202 16:24:04.126154 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-92e3-account-create-update-mql6h"] Dec 02 16:24:04 crc kubenswrapper[4933]: I1202 16:24:04.143128 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kqjfb"] Dec 02 16:24:05 crc kubenswrapper[4933]: I1202 16:24:05.031547 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2d40-account-create-update-cb4br"] Dec 02 16:24:05 crc kubenswrapper[4933]: I1202 16:24:05.045034 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2d40-account-create-update-cb4br"] Dec 02 16:24:05 crc kubenswrapper[4933]: I1202 16:24:05.072902 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b" path="/var/lib/kubelet/pods/0fb1fa15-e7a1-412d-a2c7-20e6e8e8792b/volumes" Dec 02 16:24:05 crc kubenswrapper[4933]: I1202 16:24:05.075296 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d64345-2d6d-47a6-90cf-2a7430c9749d" path="/var/lib/kubelet/pods/33d64345-2d6d-47a6-90cf-2a7430c9749d/volumes" Dec 02 16:24:05 crc kubenswrapper[4933]: I1202 16:24:05.076632 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4402628f-1d1f-44ad-8b84-e414f7345014" path="/var/lib/kubelet/pods/4402628f-1d1f-44ad-8b84-e414f7345014/volumes" Dec 02 16:24:05 crc kubenswrapper[4933]: I1202 16:24:05.077438 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96eee4be-07d4-4e98-a725-94746710698c" path="/var/lib/kubelet/pods/96eee4be-07d4-4e98-a725-94746710698c/volumes" Dec 02 16:24:06 crc kubenswrapper[4933]: I1202 16:24:06.038089 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-l2rkw"] Dec 02 16:24:06 crc kubenswrapper[4933]: I1202 16:24:06.051389 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-l2rkw"] Dec 02 16:24:07 crc kubenswrapper[4933]: I1202 16:24:07.035512 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-f87c-account-create-update-zndp8"] Dec 02 16:24:07 crc kubenswrapper[4933]: I1202 16:24:07.048711 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-f87c-account-create-update-zndp8"] Dec 02 16:24:07 crc kubenswrapper[4933]: I1202 16:24:07.069079 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63164577-9438-40d2-96b6-ba81f3d960d4" path="/var/lib/kubelet/pods/63164577-9438-40d2-96b6-ba81f3d960d4/volumes" Dec 02 16:24:07 crc kubenswrapper[4933]: I1202 16:24:07.069825 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf17e438-7511-4c81-ad30-864da63a2965" path="/var/lib/kubelet/pods/cf17e438-7511-4c81-ad30-864da63a2965/volumes" Dec 02 16:24:12 crc kubenswrapper[4933]: I1202 16:24:12.049391 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5pwnn"] Dec 02 16:24:12 crc kubenswrapper[4933]: I1202 16:24:12.062872 4933 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/glance-139c-account-create-update-rws48"] Dec 02 16:24:12 crc kubenswrapper[4933]: I1202 16:24:12.076061 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-139c-account-create-update-rws48"] Dec 02 16:24:12 crc kubenswrapper[4933]: I1202 16:24:12.099702 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5pwnn"] Dec 02 16:24:13 crc kubenswrapper[4933]: I1202 16:24:13.034935 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-9g2jj"] Dec 02 16:24:13 crc kubenswrapper[4933]: I1202 16:24:13.048089 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-9g2jj"] Dec 02 16:24:13 crc kubenswrapper[4933]: I1202 16:24:13.075633 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b64b3a1-928a-49f7-b910-6d99cc7e7cf1" path="/var/lib/kubelet/pods/1b64b3a1-928a-49f7-b910-6d99cc7e7cf1/volumes" Dec 02 16:24:13 crc kubenswrapper[4933]: I1202 16:24:13.076352 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="360b6f71-eb21-4616-87ef-d142ca4a2e1a" path="/var/lib/kubelet/pods/360b6f71-eb21-4616-87ef-d142ca4a2e1a/volumes" Dec 02 16:24:13 crc kubenswrapper[4933]: I1202 16:24:13.077039 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada1850f-4671-48a5-97f3-7b82aab8ab97" path="/var/lib/kubelet/pods/ada1850f-4671-48a5-97f3-7b82aab8ab97/volumes" Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.037840 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b25b-account-create-update-p4l95"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.052634 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dvgn2"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.067576 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b25b-account-create-update-p4l95"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.078270 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dvgn2"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.089864 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-59cd-account-create-update-fqk6n"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.100562 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gnm4h"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.111322 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gnm4h"] Dec 02 16:24:14 crc kubenswrapper[4933]: I1202 16:24:14.120864 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-59cd-account-create-update-fqk6n"] Dec 02 16:24:15 crc kubenswrapper[4933]: I1202 16:24:15.066884 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff9f4e8-8048-4527-be5c-58279375324a" path="/var/lib/kubelet/pods/1ff9f4e8-8048-4527-be5c-58279375324a/volumes" Dec 02 16:24:15 crc kubenswrapper[4933]: I1202 16:24:15.067656 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ac1fbd-600e-4795-855d-3dae65b36263" path="/var/lib/kubelet/pods/82ac1fbd-600e-4795-855d-3dae65b36263/volumes" Dec 02 16:24:15 crc kubenswrapper[4933]: I1202 16:24:15.068385 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc660678-23e8-42d3-b34e-970411b1b48c" path="/var/lib/kubelet/pods/bc660678-23e8-42d3-b34e-970411b1b48c/volumes" Dec 02 16:24:15 
crc kubenswrapper[4933]: I1202 16:24:15.068992 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be505e35-b5c6-434e-9f60-621f6adb19aa" path="/var/lib/kubelet/pods/be505e35-b5c6-434e-9f60-621f6adb19aa/volumes" Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.034620 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-ab9a-account-create-update-s4k9m"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.049115 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ssstq"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.080016 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5615-account-create-update-d844g"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.080056 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-bgnf5"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.090190 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ssstq"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.100166 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6e7e-account-create-update-6zd9l"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.110864 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-ab9a-account-create-update-s4k9m"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.121102 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5615-account-create-update-d844g"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.130171 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6e7e-account-create-update-6zd9l"] Dec 02 16:24:17 crc kubenswrapper[4933]: I1202 16:24:17.139590 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-bgnf5"] Dec 02 16:24:19 crc kubenswrapper[4933]: I1202 16:24:19.066596 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d265444-f155-4232-802e-b454de66daa6" path="/var/lib/kubelet/pods/1d265444-f155-4232-802e-b454de66daa6/volumes" Dec 02 16:24:19 crc kubenswrapper[4933]: I1202 16:24:19.067521 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3508d064-3032-4754-a0b9-a81c7e3a14b9" path="/var/lib/kubelet/pods/3508d064-3032-4754-a0b9-a81c7e3a14b9/volumes" Dec 02 16:24:19 crc kubenswrapper[4933]: I1202 16:24:19.068167 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f084c2d-a8f4-4082-9130-052fbf9c9a30" path="/var/lib/kubelet/pods/8f084c2d-a8f4-4082-9130-052fbf9c9a30/volumes" Dec 02 16:24:19 crc kubenswrapper[4933]: I1202 16:24:19.068760 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3" path="/var/lib/kubelet/pods/c4d438d7-31b1-4eb8-8794-2cc6a8aa79a3/volumes" Dec 02 16:24:19 crc kubenswrapper[4933]: I1202 16:24:19.069900 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8158de-843d-4149-8b2d-5621374e8e71" path="/var/lib/kubelet/pods/ea8158de-843d-4149-8b2d-5621374e8e71/volumes" Dec 02 16:24:21 crc kubenswrapper[4933]: I1202 16:24:21.429420 4933 generic.go:334] "Generic (PLEG): container finished" podID="2060aa16-0f55-457f-98c1-058372e78f0f" containerID="52147fa4a8f7644dac2993ebb77b6476a67f7ed664bbd3f7fdcf3949422fdead" exitCode=0
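The "Generic (PLEG): container finished" line above comes from the pod lifecycle event generator: it relists containers from the runtime, diffs the snapshot against its cache, and turns each state transition into an event that the sync loop consumes (the matching "SyncLoop (PLEG): event for pod" line that follows is that consumption). A simplified sketch of the relist-and-diff idea, using a toy two-state model rather than kubelet's actual types:

package main

import "fmt"

// Toy container states; the real PLEG tracks more than these two.
type state string

const (
	running state = "running"
	exited  state = "exited"
)

// diff emits a ContainerDied-style event for every running -> exited
// transition between two relist snapshots.
func diff(old, cur map[string]state) []string {
	var events []string
	for id, s := range cur {
		if old[id] == running && s == exited {
			events = append(events, "ContainerDied "+id)
		}
	}
	return events
}

func main() {
	prev := map[string]state{"52147fa4": running}
	next := map[string]state{"52147fa4": exited} // runtime now reports exitCode=0
	for _, e := range diff(prev, next) {
		fmt.Println(e) // ContainerDied 52147fa4
	}
}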
Dec 02 16:24:21 crc kubenswrapper[4933]: I1202 16:24:21.429481 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" event={"ID":"2060aa16-0f55-457f-98c1-058372e78f0f","Type":"ContainerDied","Data":"52147fa4a8f7644dac2993ebb77b6476a67f7ed664bbd3f7fdcf3949422fdead"} Dec 02 16:24:22 crc kubenswrapper[4933]: I1202 16:24:22.886453 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:24:22 crc kubenswrapper[4933]: I1202 16:24:22.924625 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-inventory\") pod \"2060aa16-0f55-457f-98c1-058372e78f0f\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " Dec 02 16:24:22 crc kubenswrapper[4933]: I1202 16:24:22.963768 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-inventory" (OuterVolumeSpecName: "inventory") pod "2060aa16-0f55-457f-98c1-058372e78f0f" (UID: "2060aa16-0f55-457f-98c1-058372e78f0f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.026547 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-ssh-key\") pod \"2060aa16-0f55-457f-98c1-058372e78f0f\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.026970 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-bootstrap-combined-ca-bundle\") pod \"2060aa16-0f55-457f-98c1-058372e78f0f\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.027177 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xms7w\" (UniqueName: \"kubernetes.io/projected/2060aa16-0f55-457f-98c1-058372e78f0f-kube-api-access-xms7w\") pod \"2060aa16-0f55-457f-98c1-058372e78f0f\" (UID: \"2060aa16-0f55-457f-98c1-058372e78f0f\") " Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.032458 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.050251 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vh4vf"] Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.057036 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2060aa16-0f55-457f-98c1-058372e78f0f" (UID: "2060aa16-0f55-457f-98c1-058372e78f0f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.058045 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2060aa16-0f55-457f-98c1-058372e78f0f-kube-api-access-xms7w" (OuterVolumeSpecName: "kube-api-access-xms7w") pod "2060aa16-0f55-457f-98c1-058372e78f0f" (UID: "2060aa16-0f55-457f-98c1-058372e78f0f"). InnerVolumeSpecName "kube-api-access-xms7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.093443 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2060aa16-0f55-457f-98c1-058372e78f0f" (UID: "2060aa16-0f55-457f-98c1-058372e78f0f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.135459 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.135763 4933 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2060aa16-0f55-457f-98c1-058372e78f0f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.135779 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xms7w\" (UniqueName: \"kubernetes.io/projected/2060aa16-0f55-457f-98c1-058372e78f0f-kube-api-access-xms7w\") on node \"crc\" DevicePath \"\"" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.159909 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vh4vf"] Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.454351 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" event={"ID":"2060aa16-0f55-457f-98c1-058372e78f0f","Type":"ContainerDied","Data":"843168771e4d5c08737467f304ddf760122c1c29a18f73264146ab29e93242b3"} Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.454398 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="843168771e4d5c08737467f304ddf760122c1c29a18f73264146ab29e93242b3" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.454452 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.549361 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg"] Dec 02 16:24:23 crc kubenswrapper[4933]: E1202 16:24:23.549901 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="extract-utilities" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.549925 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="extract-utilities" Dec 02 16:24:23 crc kubenswrapper[4933]: E1202 16:24:23.549958 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="extract-content" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.549968 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="extract-content" Dec 02 16:24:23 crc kubenswrapper[4933]: E1202 16:24:23.549989 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="registry-server" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.549997 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="registry-server" Dec 02 16:24:23 crc kubenswrapper[4933]: E1202 16:24:23.550031 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2060aa16-0f55-457f-98c1-058372e78f0f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.550040 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2060aa16-0f55-457f-98c1-058372e78f0f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.550290 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="2060aa16-0f55-457f-98c1-058372e78f0f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.550319 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb2fcf27-bfa3-429d-8c22-ff2ba63fcdb0" containerName="registry-server" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.551240 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.553732 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.553795 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.553793 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.558075 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.562288 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg"] Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.646613 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97jl\" (UniqueName: \"kubernetes.io/projected/2baaff97-25e3-44ca-818e-0c9d121abe01-kube-api-access-b97jl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.646750 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.646840 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.749256 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.749355 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.749487 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97jl\" (UniqueName: \"kubernetes.io/projected/2baaff97-25e3-44ca-818e-0c9d121abe01-kube-api-access-b97jl\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.754617 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.754723 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.766649 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97jl\" (UniqueName: \"kubernetes.io/projected/2baaff97-25e3-44ca-818e-0c9d121abe01-kube-api-access-b97jl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:23 crc kubenswrapper[4933]: I1202 16:24:23.873688 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:24:24 crc kubenswrapper[4933]: I1202 16:24:24.417255 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:24:24 crc kubenswrapper[4933]: I1202 16:24:24.423008 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg"] Dec 02 16:24:24 crc kubenswrapper[4933]: I1202 16:24:24.478360 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" event={"ID":"2baaff97-25e3-44ca-818e-0c9d121abe01","Type":"ContainerStarted","Data":"fa5946ba190d98e45f90261bd2280632f49623ce1eab32f63973d811dc3102aa"} Dec 02 16:24:25 crc kubenswrapper[4933]: I1202 16:24:25.072710 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2abbab-0ae7-4cb8-9b51-a2423b7948da" path="/var/lib/kubelet/pods/8e2abbab-0ae7-4cb8-9b51-a2423b7948da/volumes" Dec 02 16:24:25 crc kubenswrapper[4933]: I1202 16:24:25.492606 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" event={"ID":"2baaff97-25e3-44ca-818e-0c9d121abe01","Type":"ContainerStarted","Data":"6f3e060059b607e96c92b12c92ec8488a4fe14f08a90b4d5bda7202ccd6edaf7"} Dec 02 16:24:25 crc kubenswrapper[4933]: I1202 16:24:25.514177 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" podStartSLOduration=2.068836752 podStartE2EDuration="2.5141579s" podCreationTimestamp="2025-12-02 16:24:23 +0000 UTC" firstStartedPulling="2025-12-02 16:24:24.416976168 +0000 UTC m=+1927.668202871" lastFinishedPulling="2025-12-02 16:24:24.862297306 +0000 UTC m=+1928.113524019" observedRunningTime="2025-12-02 16:24:25.506025073 +0000 UTC m=+1928.757251776" 
watchObservedRunningTime="2025-12-02 16:24:25.5141579 +0000 UTC m=+1928.765384603" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.625508 4933 scope.go:117] "RemoveContainer" containerID="680359322fbe0d13024f5c435bbcfbef006d6d62d239663d711e12b930f56cb0" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.650417 4933 scope.go:117] "RemoveContainer" containerID="81292d89bccccc8f23d5cb412aed4c0a83459fb6811457b2feac58a039c13c48" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.702612 4933 scope.go:117] "RemoveContainer" containerID="a96f0cd0e5a6ddd987b6b04080a43336cc8ffe9e08346f77b8de85b4e09b413c" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.733653 4933 scope.go:117] "RemoveContainer" containerID="fee102c14fb246b50092852ad06657a7a932daf8c51286fd2e6bf917c0160042" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.780938 4933 scope.go:117] "RemoveContainer" containerID="c67940f616718367b793f26e7ff40e948807364b58ce4eab90d8cb3819d1b627" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.842446 4933 scope.go:117] "RemoveContainer" containerID="2dcb116ae0821b2675b43f05b8244b124bb63ad3795def616da768a32fbfefd4" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.892042 4933 scope.go:117] "RemoveContainer" containerID="dfa4f9abf32790f776aba1c95b5303a69d2be0c713f9ab9292d8904028a9af3f" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.952077 4933 scope.go:117] "RemoveContainer" containerID="75029d347067d95cfb2f6f53714d3275445b13b9bf53c763cc00d6512c4308b4" Dec 02 16:24:43 crc kubenswrapper[4933]: I1202 16:24:43.979028 4933 scope.go:117] "RemoveContainer" containerID="f389c81d1304fc08b5c71de6d57df6441f36092cc3f38637b57496111803c656" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.006756 4933 scope.go:117] "RemoveContainer" containerID="6184db6079b1450eda447c57617970dfad41e894f6184cddfc8d394add273cf6" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.033324 4933 scope.go:117] "RemoveContainer" containerID="a2333808787add1f966b4b40008b6f37bd16437f2a0180b7d8d03e1e9b4eb72d" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.055280 4933 scope.go:117] "RemoveContainer" containerID="714b6d3a0e45b0bca815b97029794c499c5e4972182ee301d007b07f8025b6da" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.087355 4933 scope.go:117] "RemoveContainer" containerID="110e9e366b344c9454e199763a09d2f3161de244d6494ea4d8f115ca99b1faf0" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.107389 4933 scope.go:117] "RemoveContainer" containerID="972925d235e7cc5bec7e3cf4f10f50400f5340b66c9f7fe09d26ce2ff882f8c3" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.127762 4933 scope.go:117] "RemoveContainer" containerID="1d9bd01203c548cba964d12bb32bb71fdb8053e98deda054288b5b0db059913c" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.151252 4933 scope.go:117] "RemoveContainer" containerID="6c1d76c96756bbec8155e7a1c2bf381e04eaeec22018684ad5072b259093274a" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.182629 4933 scope.go:117] "RemoveContainer" containerID="7b1087e21782d7d67edc0d348786fee50b0980ada5c687456d587679e77d17fe" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.201671 4933 scope.go:117] "RemoveContainer" containerID="e6a144c1d0da23141ffa91100cf1e01ba5c7b68dbd0a8ed2ac949eb31b1a5686" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.224561 4933 scope.go:117] "RemoveContainer" containerID="fe4e7b2cb0d52f4b9903fec1ed83f1d9d7beccad4248982089285af8e762b828" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.245154 4933 
scope.go:117] "RemoveContainer" containerID="3d3a713cc446acee639e6dad31a015f8d81143a07efdac0c0f386481d3124631" Dec 02 16:24:44 crc kubenswrapper[4933]: I1202 16:24:44.262592 4933 scope.go:117] "RemoveContainer" containerID="2435a130c6381d22ba3e9e38424d379235216bc10da713924dae4f1d2896c524" Dec 02 16:25:05 crc kubenswrapper[4933]: I1202 16:25:05.042634 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-x8fgp"] Dec 02 16:25:05 crc kubenswrapper[4933]: I1202 16:25:05.069705 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-x8fgp"] Dec 02 16:25:07 crc kubenswrapper[4933]: I1202 16:25:07.033149 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sxblk"] Dec 02 16:25:07 crc kubenswrapper[4933]: I1202 16:25:07.045017 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sxblk"] Dec 02 16:25:07 crc kubenswrapper[4933]: I1202 16:25:07.067920 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4" path="/var/lib/kubelet/pods/5b7bd174-a0e2-41ff-bc1e-46095ec6f2c4/volumes" Dec 02 16:25:07 crc kubenswrapper[4933]: I1202 16:25:07.068798 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d342549d-487a-4b41-89cc-8ce263aed373" path="/var/lib/kubelet/pods/d342549d-487a-4b41-89cc-8ce263aed373/volumes" Dec 02 16:25:11 crc kubenswrapper[4933]: I1202 16:25:11.042305 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pd5s5"] Dec 02 16:25:11 crc kubenswrapper[4933]: I1202 16:25:11.051917 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pd5s5"] Dec 02 16:25:11 crc kubenswrapper[4933]: I1202 16:25:11.066772 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7179e9e-f623-44a2-9f70-021244770e56" path="/var/lib/kubelet/pods/f7179e9e-f623-44a2-9f70-021244770e56/volumes" Dec 02 16:25:16 crc kubenswrapper[4933]: I1202 16:25:16.031173 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rwgj6"] Dec 02 16:25:16 crc kubenswrapper[4933]: I1202 16:25:16.041089 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rwgj6"] Dec 02 16:25:17 crc kubenswrapper[4933]: I1202 16:25:17.069203 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ec29fe-4f0d-4afe-9440-0cdc7b30d651" path="/var/lib/kubelet/pods/39ec29fe-4f0d-4afe-9440-0cdc7b30d651/volumes" Dec 02 16:25:19 crc kubenswrapper[4933]: I1202 16:25:19.059205 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-5xvkq" podUID="9f5ccd80-acb3-4128-bf7c-e2726a9fe5ab" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 16:25:23 crc kubenswrapper[4933]: I1202 16:25:23.050867 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-649gh"] Dec 02 16:25:23 crc kubenswrapper[4933]: I1202 16:25:23.100175 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-649gh"] Dec 02 16:25:25 crc kubenswrapper[4933]: I1202 16:25:25.075412 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae883c36-aac5-49f4-839c-d0140fe724cc" path="/var/lib/kubelet/pods/ae883c36-aac5-49f4-839c-d0140fe724cc/volumes" Dec 02 16:25:34 crc kubenswrapper[4933]: I1202 16:25:34.031327 4933 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-db-sync-gtbcc"] Dec 02 16:25:34 crc kubenswrapper[4933]: I1202 16:25:34.048884 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gtbcc"] Dec 02 16:25:35 crc kubenswrapper[4933]: I1202 16:25:35.093662 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c326307-73df-462e-98ec-0e4dc89fdd54" path="/var/lib/kubelet/pods/9c326307-73df-462e-98ec-0e4dc89fdd54/volumes" Dec 02 16:25:44 crc kubenswrapper[4933]: I1202 16:25:44.631517 4933 scope.go:117] "RemoveContainer" containerID="21d83126c87174185d7f389dac8b1554c1fbb2e8b04e811961f27bd1a5cff784" Dec 02 16:25:44 crc kubenswrapper[4933]: I1202 16:25:44.663571 4933 scope.go:117] "RemoveContainer" containerID="695f67b9aea9f7e5082f73616ee2fd565dbbf11fa4f169dba5f0d34bb67047dc" Dec 02 16:25:44 crc kubenswrapper[4933]: I1202 16:25:44.729812 4933 scope.go:117] "RemoveContainer" containerID="c37bb5d562f32a158fdbc43377a6a660cf52d09e6b14589c72e72f0ef54b3878" Dec 02 16:25:44 crc kubenswrapper[4933]: I1202 16:25:44.787257 4933 scope.go:117] "RemoveContainer" containerID="c0ba5a252954e9480867705618a22bf881b3aa04ad391282d038ca64caa0c0f4" Dec 02 16:25:44 crc kubenswrapper[4933]: I1202 16:25:44.849682 4933 scope.go:117] "RemoveContainer" containerID="aa43f89a5ee3cc09277b8d1c5371526e918b571f89705e74235a7dbb6f5a2482" Dec 02 16:25:44 crc kubenswrapper[4933]: I1202 16:25:44.893185 4933 scope.go:117] "RemoveContainer" containerID="4e73c93e39976283c0cf550c068a7692530e082ccc049f2f51661c2a9b7ee222" Dec 02 16:26:17 crc kubenswrapper[4933]: I1202 16:26:17.169525 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:26:17 crc kubenswrapper[4933]: I1202 16:26:17.170162 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:26:29 crc kubenswrapper[4933]: I1202 16:26:29.993445 4933 generic.go:334] "Generic (PLEG): container finished" podID="2baaff97-25e3-44ca-818e-0c9d121abe01" containerID="6f3e060059b607e96c92b12c92ec8488a4fe14f08a90b4d5bda7202ccd6edaf7" exitCode=0 Dec 02 16:26:29 crc kubenswrapper[4933]: I1202 16:26:29.994048 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" event={"ID":"2baaff97-25e3-44ca-818e-0c9d121abe01","Type":"ContainerDied","Data":"6f3e060059b607e96c92b12c92ec8488a4fe14f08a90b4d5bda7202ccd6edaf7"} Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.539285 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.696164 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-ssh-key\") pod \"2baaff97-25e3-44ca-818e-0c9d121abe01\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.696402 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-inventory\") pod \"2baaff97-25e3-44ca-818e-0c9d121abe01\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.696500 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b97jl\" (UniqueName: \"kubernetes.io/projected/2baaff97-25e3-44ca-818e-0c9d121abe01-kube-api-access-b97jl\") pod \"2baaff97-25e3-44ca-818e-0c9d121abe01\" (UID: \"2baaff97-25e3-44ca-818e-0c9d121abe01\") " Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.701945 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baaff97-25e3-44ca-818e-0c9d121abe01-kube-api-access-b97jl" (OuterVolumeSpecName: "kube-api-access-b97jl") pod "2baaff97-25e3-44ca-818e-0c9d121abe01" (UID: "2baaff97-25e3-44ca-818e-0c9d121abe01"). InnerVolumeSpecName "kube-api-access-b97jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.730037 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-inventory" (OuterVolumeSpecName: "inventory") pod "2baaff97-25e3-44ca-818e-0c9d121abe01" (UID: "2baaff97-25e3-44ca-818e-0c9d121abe01"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.732403 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2baaff97-25e3-44ca-818e-0c9d121abe01" (UID: "2baaff97-25e3-44ca-818e-0c9d121abe01"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.799701 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.799731 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baaff97-25e3-44ca-818e-0c9d121abe01-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:26:31 crc kubenswrapper[4933]: I1202 16:26:31.799741 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b97jl\" (UniqueName: \"kubernetes.io/projected/2baaff97-25e3-44ca-818e-0c9d121abe01-kube-api-access-b97jl\") on node \"crc\" DevicePath \"\"" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.024585 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" event={"ID":"2baaff97-25e3-44ca-818e-0c9d121abe01","Type":"ContainerDied","Data":"fa5946ba190d98e45f90261bd2280632f49623ce1eab32f63973d811dc3102aa"} Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.024630 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5946ba190d98e45f90261bd2280632f49623ce1eab32f63973d811dc3102aa" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.024698 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.115231 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr"] Dec 02 16:26:32 crc kubenswrapper[4933]: E1202 16:26:32.116043 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2baaff97-25e3-44ca-818e-0c9d121abe01" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.116092 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baaff97-25e3-44ca-818e-0c9d121abe01" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.116539 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="2baaff97-25e3-44ca-818e-0c9d121abe01" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.118176 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.121623 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.121893 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.122834 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.122978 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.130841 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr"] Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.311752 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.312116 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/0a4e084f-f2c5-418d-8990-6074168317ab-kube-api-access-gwmsg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.312217 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.415102 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.415378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.415404 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/0a4e084f-f2c5-418d-8990-6074168317ab-kube-api-access-gwmsg\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.419162 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.420053 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.431726 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/0a4e084f-f2c5-418d-8990-6074168317ab-kube-api-access-gwmsg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87nbr\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:32 crc kubenswrapper[4933]: I1202 16:26:32.479503 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:26:33 crc kubenswrapper[4933]: I1202 16:26:33.046850 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr"] Dec 02 16:26:33 crc kubenswrapper[4933]: W1202 16:26:33.062346 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4e084f_f2c5_418d_8990_6074168317ab.slice/crio-f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589 WatchSource:0}: Error finding container f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589: Status 404 returned error can't find the container with id f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589 Dec 02 16:26:34 crc kubenswrapper[4933]: I1202 16:26:34.048221 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" event={"ID":"0a4e084f-f2c5-418d-8990-6074168317ab","Type":"ContainerStarted","Data":"9210a7f51593fb3407bde7262f5e3600a90330ccf6bb8288c2d0913791edf7ef"} Dec 02 16:26:34 crc kubenswrapper[4933]: I1202 16:26:34.048557 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" event={"ID":"0a4e084f-f2c5-418d-8990-6074168317ab","Type":"ContainerStarted","Data":"f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589"} Dec 02 16:26:34 crc kubenswrapper[4933]: I1202 16:26:34.072293 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" podStartSLOduration=1.531179131 podStartE2EDuration="2.072275437s" podCreationTimestamp="2025-12-02 16:26:32 +0000 UTC" firstStartedPulling="2025-12-02 16:26:33.068418746 +0000 UTC 
m=+2056.319645449" lastFinishedPulling="2025-12-02 16:26:33.609515042 +0000 UTC m=+2056.860741755" observedRunningTime="2025-12-02 16:26:34.06452864 +0000 UTC m=+2057.315755353" watchObservedRunningTime="2025-12-02 16:26:34.072275437 +0000 UTC m=+2057.323502140" Dec 02 16:26:34 crc kubenswrapper[4933]: I1202 16:26:34.925083 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjdxd"] Dec 02 16:26:34 crc kubenswrapper[4933]: I1202 16:26:34.930287 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:34 crc kubenswrapper[4933]: I1202 16:26:34.937629 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjdxd"] Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.076994 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-utilities\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.077048 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-catalog-content\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.077066 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkmt6\" (UniqueName: \"kubernetes.io/projected/47e7134c-d645-42ee-8590-1141139cf727-kube-api-access-bkmt6\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.182905 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-utilities\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.182969 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-catalog-content\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.182992 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkmt6\" (UniqueName: \"kubernetes.io/projected/47e7134c-d645-42ee-8590-1141139cf727-kube-api-access-bkmt6\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.183540 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-utilities\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " 
pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.183662 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-catalog-content\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.207740 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkmt6\" (UniqueName: \"kubernetes.io/projected/47e7134c-d645-42ee-8590-1141139cf727-kube-api-access-bkmt6\") pod \"redhat-operators-zjdxd\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.246431 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:35 crc kubenswrapper[4933]: I1202 16:26:35.731808 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjdxd"] Dec 02 16:26:36 crc kubenswrapper[4933]: I1202 16:26:36.069787 4933 generic.go:334] "Generic (PLEG): container finished" podID="47e7134c-d645-42ee-8590-1141139cf727" containerID="200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180" exitCode=0 Dec 02 16:26:36 crc kubenswrapper[4933]: I1202 16:26:36.069878 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerDied","Data":"200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180"} Dec 02 16:26:36 crc kubenswrapper[4933]: I1202 16:26:36.069913 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerStarted","Data":"455cb3bd858921b3cf4e4e7134b5e9da8daa7db9429ea5861418539e3ddc7b02"} Dec 02 16:26:38 crc kubenswrapper[4933]: I1202 16:26:38.104426 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerStarted","Data":"b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6"} Dec 02 16:26:41 crc kubenswrapper[4933]: I1202 16:26:41.142576 4933 generic.go:334] "Generic (PLEG): container finished" podID="47e7134c-d645-42ee-8590-1141139cf727" containerID="b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6" exitCode=0 Dec 02 16:26:41 crc kubenswrapper[4933]: I1202 16:26:41.142617 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerDied","Data":"b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6"} Dec 02 16:26:42 crc kubenswrapper[4933]: I1202 16:26:42.161691 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerStarted","Data":"c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75"} Dec 02 16:26:42 crc kubenswrapper[4933]: I1202 16:26:42.186995 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjdxd" podStartSLOduration=2.463214352 podStartE2EDuration="8.186977629s" 
podCreationTimestamp="2025-12-02 16:26:34 +0000 UTC" firstStartedPulling="2025-12-02 16:26:36.07209668 +0000 UTC m=+2059.323323383" lastFinishedPulling="2025-12-02 16:26:41.795859967 +0000 UTC m=+2065.047086660" observedRunningTime="2025-12-02 16:26:42.182053318 +0000 UTC m=+2065.433280041" watchObservedRunningTime="2025-12-02 16:26:42.186977629 +0000 UTC m=+2065.438204332" Dec 02 16:26:45 crc kubenswrapper[4933]: I1202 16:26:45.246908 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:45 crc kubenswrapper[4933]: I1202 16:26:45.247360 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:46 crc kubenswrapper[4933]: I1202 16:26:46.322337 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zjdxd" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="registry-server" probeResult="failure" output=< Dec 02 16:26:46 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 16:26:46 crc kubenswrapper[4933]: > Dec 02 16:26:47 crc kubenswrapper[4933]: I1202 16:26:47.169896 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:26:47 crc kubenswrapper[4933]: I1202 16:26:47.169950 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:26:51 crc kubenswrapper[4933]: I1202 16:26:51.079214 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-56a2-account-create-update-nt6cr"] Dec 02 16:26:51 crc kubenswrapper[4933]: I1202 16:26:51.079721 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pvgh5"] Dec 02 16:26:51 crc kubenswrapper[4933]: I1202 16:26:51.081199 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jq7m4"] Dec 02 16:26:51 crc kubenswrapper[4933]: I1202 16:26:51.093161 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-56a2-account-create-update-nt6cr"] Dec 02 16:26:51 crc kubenswrapper[4933]: I1202 16:26:51.103584 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jq7m4"] Dec 02 16:26:51 crc kubenswrapper[4933]: I1202 16:26:51.116083 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pvgh5"] Dec 02 16:26:52 crc kubenswrapper[4933]: I1202 16:26:52.035914 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2cb5-account-create-update-7gdfb"] Dec 02 16:26:52 crc kubenswrapper[4933]: I1202 16:26:52.053559 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-h4bvg"] Dec 02 16:26:52 crc kubenswrapper[4933]: I1202 16:26:52.065940 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-37a8-account-create-update-2fzs9"] Dec 02 16:26:52 crc kubenswrapper[4933]: I1202 16:26:52.077490 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-37a8-account-create-update-2fzs9"] Dec 02 16:26:52 crc kubenswrapper[4933]: I1202 16:26:52.088210 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-h4bvg"] Dec 02 16:26:52 crc kubenswrapper[4933]: I1202 16:26:52.098655 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2cb5-account-create-update-7gdfb"] Dec 02 16:26:53 crc kubenswrapper[4933]: I1202 16:26:53.080895 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7971426e-62c3-4709-8e81-8ce9a491f510" path="/var/lib/kubelet/pods/7971426e-62c3-4709-8e81-8ce9a491f510/volumes" Dec 02 16:26:53 crc kubenswrapper[4933]: I1202 16:26:53.081867 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd90a06-f10c-4249-99c3-f186848b27f2" path="/var/lib/kubelet/pods/8bd90a06-f10c-4249-99c3-f186848b27f2/volumes" Dec 02 16:26:53 crc kubenswrapper[4933]: I1202 16:26:53.082866 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bea586b-0ef5-4524-8171-5dce637605a8" path="/var/lib/kubelet/pods/9bea586b-0ef5-4524-8171-5dce637605a8/volumes" Dec 02 16:26:53 crc kubenswrapper[4933]: I1202 16:26:53.083586 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adec7c24-3d21-4a10-a5f7-d7bf669b1007" path="/var/lib/kubelet/pods/adec7c24-3d21-4a10-a5f7-d7bf669b1007/volumes" Dec 02 16:26:53 crc kubenswrapper[4933]: I1202 16:26:53.085250 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e7c120-984b-477a-929e-d0b9ddec63d8" path="/var/lib/kubelet/pods/d4e7c120-984b-477a-929e-d0b9ddec63d8/volumes" Dec 02 16:26:53 crc kubenswrapper[4933]: I1202 16:26:53.085956 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e488a57f-0085-46e8-b496-9d5886e003fb" path="/var/lib/kubelet/pods/e488a57f-0085-46e8-b496-9d5886e003fb/volumes" Dec 02 16:26:55 crc kubenswrapper[4933]: I1202 16:26:55.306138 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:55 crc kubenswrapper[4933]: I1202 16:26:55.363467 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:55 crc kubenswrapper[4933]: I1202 16:26:55.550592 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjdxd"] Dec 02 16:26:56 crc kubenswrapper[4933]: I1202 16:26:56.426104 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjdxd" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="registry-server" containerID="cri-o://c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75" gracePeriod=2 Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.077872 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.199902 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-catalog-content\") pod \"47e7134c-d645-42ee-8590-1141139cf727\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.200208 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-utilities\") pod \"47e7134c-d645-42ee-8590-1141139cf727\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.200266 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkmt6\" (UniqueName: \"kubernetes.io/projected/47e7134c-d645-42ee-8590-1141139cf727-kube-api-access-bkmt6\") pod \"47e7134c-d645-42ee-8590-1141139cf727\" (UID: \"47e7134c-d645-42ee-8590-1141139cf727\") " Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.201487 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-utilities" (OuterVolumeSpecName: "utilities") pod "47e7134c-d645-42ee-8590-1141139cf727" (UID: "47e7134c-d645-42ee-8590-1141139cf727"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.209710 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e7134c-d645-42ee-8590-1141139cf727-kube-api-access-bkmt6" (OuterVolumeSpecName: "kube-api-access-bkmt6") pod "47e7134c-d645-42ee-8590-1141139cf727" (UID: "47e7134c-d645-42ee-8590-1141139cf727"). InnerVolumeSpecName "kube-api-access-bkmt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.302894 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.302935 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkmt6\" (UniqueName: \"kubernetes.io/projected/47e7134c-d645-42ee-8590-1141139cf727-kube-api-access-bkmt6\") on node \"crc\" DevicePath \"\"" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.326098 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47e7134c-d645-42ee-8590-1141139cf727" (UID: "47e7134c-d645-42ee-8590-1141139cf727"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.405033 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e7134c-d645-42ee-8590-1141139cf727-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.437708 4933 generic.go:334] "Generic (PLEG): container finished" podID="47e7134c-d645-42ee-8590-1141139cf727" containerID="c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75" exitCode=0 Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.437781 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjdxd" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.437774 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerDied","Data":"c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75"} Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.438897 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjdxd" event={"ID":"47e7134c-d645-42ee-8590-1141139cf727","Type":"ContainerDied","Data":"455cb3bd858921b3cf4e4e7134b5e9da8daa7db9429ea5861418539e3ddc7b02"} Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.438929 4933 scope.go:117] "RemoveContainer" containerID="c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.465444 4933 scope.go:117] "RemoveContainer" containerID="b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.483594 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjdxd"] Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.501858 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjdxd"] Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.507174 4933 scope.go:117] "RemoveContainer" containerID="200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.543248 4933 scope.go:117] "RemoveContainer" containerID="c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75" Dec 02 16:26:57 crc kubenswrapper[4933]: E1202 16:26:57.543664 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75\": container with ID starting with c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75 not found: ID does not exist" containerID="c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.543718 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75"} err="failed to get container status \"c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75\": rpc error: code = NotFound desc = could not find container \"c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75\": container with ID starting with c7c3d3eb3c512918b3901b42e95a1cb324bfbc162d274bc04779bad5185e7e75 not found: ID does not exist" Dec 02 16:26:57 crc 
kubenswrapper[4933]: I1202 16:26:57.543745 4933 scope.go:117] "RemoveContainer" containerID="b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6" Dec 02 16:26:57 crc kubenswrapper[4933]: E1202 16:26:57.544543 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6\": container with ID starting with b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6 not found: ID does not exist" containerID="b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.544576 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6"} err="failed to get container status \"b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6\": rpc error: code = NotFound desc = could not find container \"b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6\": container with ID starting with b3dbd636851ba19c331c2c37e9b178b0a314bc1691468b1de62c78f9f55303d6 not found: ID does not exist" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.544594 4933 scope.go:117] "RemoveContainer" containerID="200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180" Dec 02 16:26:57 crc kubenswrapper[4933]: E1202 16:26:57.545090 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180\": container with ID starting with 200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180 not found: ID does not exist" containerID="200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180" Dec 02 16:26:57 crc kubenswrapper[4933]: I1202 16:26:57.545137 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180"} err="failed to get container status \"200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180\": rpc error: code = NotFound desc = could not find container \"200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180\": container with ID starting with 200384aa4227c6799eeeb3cf8387ef6c75ade0fff75885e8f80ddd086c0bb180 not found: ID does not exist" Dec 02 16:26:59 crc kubenswrapper[4933]: I1202 16:26:59.068493 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e7134c-d645-42ee-8590-1141139cf727" path="/var/lib/kubelet/pods/47e7134c-d645-42ee-8590-1141139cf727/volumes" Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.068098 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-9217-account-create-update-mnldr"] Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.068761 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-nc5lc"] Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.075908 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-9217-account-create-update-mnldr"] Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.085894 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-nc5lc"] Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.169428 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.169479 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.169518 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.170063 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"620bdd0c4263af7415f957bb835c5b4382baf2a25794824589647a73b3d410f0"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.170119 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://620bdd0c4263af7415f957bb835c5b4382baf2a25794824589647a73b3d410f0" gracePeriod=600 Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.703039 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="620bdd0c4263af7415f957bb835c5b4382baf2a25794824589647a73b3d410f0" exitCode=0 Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.703133 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"620bdd0c4263af7415f957bb835c5b4382baf2a25794824589647a73b3d410f0"} Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.703417 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"} Dec 02 16:27:17 crc kubenswrapper[4933]: I1202 16:27:17.703460 4933 scope.go:117] "RemoveContainer" containerID="3464037e44f6657a3e78f158e1b9ac51eea30ff49dcabcf02071d819dacd47c7" Dec 02 16:27:19 crc kubenswrapper[4933]: I1202 16:27:19.068613 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8" path="/var/lib/kubelet/pods/6e06c62d-ee51-43e3-aa5e-eb045d4ec1c8/volumes" Dec 02 16:27:19 crc kubenswrapper[4933]: I1202 16:27:19.071547 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cde07d2-340f-48ad-bb66-db94386d9052" path="/var/lib/kubelet/pods/9cde07d2-340f-48ad-bb66-db94386d9052/volumes" Dec 02 16:27:23 crc kubenswrapper[4933]: I1202 16:27:23.080313 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ck96l"] Dec 02 16:27:23 crc kubenswrapper[4933]: I1202 16:27:23.081039 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-ck96l"] Dec 02 16:27:25 crc kubenswrapper[4933]: I1202 16:27:25.075768 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21060aba-17fe-429c-b7db-206edaaf91b4" path="/var/lib/kubelet/pods/21060aba-17fe-429c-b7db-206edaaf91b4/volumes" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.137120 4933 scope.go:117] "RemoveContainer" containerID="a13ae2dda21fde57b30ec26bf269ac5108cc18c5faac2ae044bc0843d42b2feb" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.198660 4933 scope.go:117] "RemoveContainer" containerID="13dfbc5c0d228e4615e41c19a2b1d2405ee951cad4901d23e29cefd7ef71304c" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.278593 4933 scope.go:117] "RemoveContainer" containerID="b09775dea3640bffaee49f07a4d5c9359c413c076fe5b41e55c987e1c93bb0e1" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.329690 4933 scope.go:117] "RemoveContainer" containerID="7e58dcaca13dfc2b18cb9fef5e610f57b93bd430c5179fe8c0df14b01a0dd738" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.379083 4933 scope.go:117] "RemoveContainer" containerID="761e5650d93526f31152508daccf11a2a4a61a8b05ea5e05382de732134fb744" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.433103 4933 scope.go:117] "RemoveContainer" containerID="d164e95c31b94065ab22675c51a7b0bee0a24ed21e57d8134fbde780098d8c45" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.477601 4933 scope.go:117] "RemoveContainer" containerID="4ec71ca02ee7d7b0536de444c3d4a00719a478fad96c644b5725c9f37563e7d7" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.507442 4933 scope.go:117] "RemoveContainer" containerID="98f6a454b62a21e83e74b898e2d989fa4846f785c24ecabe3e51d7ebb5cbb6c7" Dec 02 16:27:45 crc kubenswrapper[4933]: I1202 16:27:45.532221 4933 scope.go:117] "RemoveContainer" containerID="7817274f9e9d8ada2f089d65e2b8f010643d1a99e66bb96bd5e79509a196423d" Dec 02 16:27:46 crc kubenswrapper[4933]: I1202 16:27:46.055501 4933 generic.go:334] "Generic (PLEG): container finished" podID="0a4e084f-f2c5-418d-8990-6074168317ab" containerID="9210a7f51593fb3407bde7262f5e3600a90330ccf6bb8288c2d0913791edf7ef" exitCode=0 Dec 02 16:27:46 crc kubenswrapper[4933]: I1202 16:27:46.055548 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" event={"ID":"0a4e084f-f2c5-418d-8990-6074168317ab","Type":"ContainerDied","Data":"9210a7f51593fb3407bde7262f5e3600a90330ccf6bb8288c2d0913791edf7ef"} Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.605159 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.630533 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-inventory\") pod \"0a4e084f-f2c5-418d-8990-6074168317ab\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.630576 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-ssh-key\") pod \"0a4e084f-f2c5-418d-8990-6074168317ab\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.630992 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/0a4e084f-f2c5-418d-8990-6074168317ab-kube-api-access-gwmsg\") pod \"0a4e084f-f2c5-418d-8990-6074168317ab\" (UID: \"0a4e084f-f2c5-418d-8990-6074168317ab\") " Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.694871 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4e084f-f2c5-418d-8990-6074168317ab-kube-api-access-gwmsg" (OuterVolumeSpecName: "kube-api-access-gwmsg") pod "0a4e084f-f2c5-418d-8990-6074168317ab" (UID: "0a4e084f-f2c5-418d-8990-6074168317ab"). InnerVolumeSpecName "kube-api-access-gwmsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.699517 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-inventory" (OuterVolumeSpecName: "inventory") pod "0a4e084f-f2c5-418d-8990-6074168317ab" (UID: "0a4e084f-f2c5-418d-8990-6074168317ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.735286 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwmsg\" (UniqueName: \"kubernetes.io/projected/0a4e084f-f2c5-418d-8990-6074168317ab-kube-api-access-gwmsg\") on node \"crc\" DevicePath \"\"" Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.735328 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.750262 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a4e084f-f2c5-418d-8990-6074168317ab" (UID: "0a4e084f-f2c5-418d-8990-6074168317ab"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:27:47 crc kubenswrapper[4933]: I1202 16:27:47.837951 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4e084f-f2c5-418d-8990-6074168317ab-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.097027 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" event={"ID":"0a4e084f-f2c5-418d-8990-6074168317ab","Type":"ContainerDied","Data":"f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589"} Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.097103 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.097197 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87nbr" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.216076 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4"] Dec 02 16:27:48 crc kubenswrapper[4933]: E1202 16:27:48.216661 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="registry-server" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.216679 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="registry-server" Dec 02 16:27:48 crc kubenswrapper[4933]: E1202 16:27:48.216694 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="extract-content" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.216704 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="extract-content" Dec 02 16:27:48 crc kubenswrapper[4933]: E1202 16:27:48.216722 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="extract-utilities" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.216731 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="extract-utilities" Dec 02 16:27:48 crc kubenswrapper[4933]: E1202 16:27:48.216742 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4e084f-f2c5-418d-8990-6074168317ab" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.216749 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4e084f-f2c5-418d-8990-6074168317ab" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.217097 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e7134c-d645-42ee-8590-1141139cf727" containerName="registry-server" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.217118 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4e084f-f2c5-418d-8990-6074168317ab" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.217947 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.220585 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.220827 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.220984 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.221999 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.232574 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4"] Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.245683 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.246154 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.246234 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4rt\" (UniqueName: \"kubernetes.io/projected/7aa88472-106f-484d-ae92-94c064b2a908-kube-api-access-mf4rt\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: E1202 16:27:48.321215 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4e084f_f2c5_418d_8990_6074168317ab.slice/crio-f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4e084f_f2c5_418d_8990_6074168317ab.slice\": RecentStats: unable to find data in memory cache]" Dec 02 16:27:48 crc kubenswrapper[4933]: E1202 16:27:48.321300 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4e084f_f2c5_418d_8990_6074168317ab.slice/crio-f9b773c9d867f5ccb2e22dd1d50b5c4196e244902c70833af712888b59143589\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4e084f_f2c5_418d_8990_6074168317ab.slice\": RecentStats: unable to find data in memory cache]" Dec 
02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.348018 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.348190 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.348236 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf4rt\" (UniqueName: \"kubernetes.io/projected/7aa88472-106f-484d-ae92-94c064b2a908-kube-api-access-mf4rt\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.351937 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.352601 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.364162 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf4rt\" (UniqueName: \"kubernetes.io/projected/7aa88472-106f-484d-ae92-94c064b2a908-kube-api-access-mf4rt\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9htq4\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:48 crc kubenswrapper[4933]: I1202 16:27:48.543483 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:49 crc kubenswrapper[4933]: I1202 16:27:49.088532 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4"] Dec 02 16:27:49 crc kubenswrapper[4933]: I1202 16:27:49.114193 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" event={"ID":"7aa88472-106f-484d-ae92-94c064b2a908","Type":"ContainerStarted","Data":"7dce71cdb41d09168085ca1f31ff13bd0a4f188d3977b18b066fd81ae3a85a67"} Dec 02 16:27:50 crc kubenswrapper[4933]: I1202 16:27:50.052983 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz7nl"] Dec 02 16:27:50 crc kubenswrapper[4933]: I1202 16:27:50.071925 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6zh5j"] Dec 02 16:27:50 crc kubenswrapper[4933]: I1202 16:27:50.086129 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dz7nl"] Dec 02 16:27:50 crc kubenswrapper[4933]: I1202 16:27:50.098867 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6zh5j"] Dec 02 16:27:50 crc kubenswrapper[4933]: I1202 16:27:50.133428 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" event={"ID":"7aa88472-106f-484d-ae92-94c064b2a908","Type":"ContainerStarted","Data":"9ec43b4f88230ba83f7de1e68d05f54739f581aacc5e333aeb73734a74ff331c"} Dec 02 16:27:50 crc kubenswrapper[4933]: I1202 16:27:50.154939 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" podStartSLOduration=1.634084558 podStartE2EDuration="2.154914498s" podCreationTimestamp="2025-12-02 16:27:48 +0000 UTC" firstStartedPulling="2025-12-02 16:27:49.096658074 +0000 UTC m=+2132.347884777" lastFinishedPulling="2025-12-02 16:27:49.617488014 +0000 UTC m=+2132.868714717" observedRunningTime="2025-12-02 16:27:50.149139554 +0000 UTC m=+2133.400366257" watchObservedRunningTime="2025-12-02 16:27:50.154914498 +0000 UTC m=+2133.406141201" Dec 02 16:27:51 crc kubenswrapper[4933]: I1202 16:27:51.066432 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ded5fa-9330-468d-b544-de40fe542bf0" path="/var/lib/kubelet/pods/19ded5fa-9330-468d-b544-de40fe542bf0/volumes" Dec 02 16:27:51 crc kubenswrapper[4933]: I1202 16:27:51.067450 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d2793b-1d77-4e12-bbaf-104e4174d0c6" path="/var/lib/kubelet/pods/52d2793b-1d77-4e12-bbaf-104e4174d0c6/volumes" Dec 02 16:27:55 crc kubenswrapper[4933]: I1202 16:27:55.187933 4933 generic.go:334] "Generic (PLEG): container finished" podID="7aa88472-106f-484d-ae92-94c064b2a908" containerID="9ec43b4f88230ba83f7de1e68d05f54739f581aacc5e333aeb73734a74ff331c" exitCode=0 Dec 02 16:27:55 crc kubenswrapper[4933]: I1202 16:27:55.188132 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" event={"ID":"7aa88472-106f-484d-ae92-94c064b2a908","Type":"ContainerDied","Data":"9ec43b4f88230ba83f7de1e68d05f54739f581aacc5e333aeb73734a74ff331c"} Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.797944 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.860784 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-ssh-key\") pod \"7aa88472-106f-484d-ae92-94c064b2a908\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.860947 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf4rt\" (UniqueName: \"kubernetes.io/projected/7aa88472-106f-484d-ae92-94c064b2a908-kube-api-access-mf4rt\") pod \"7aa88472-106f-484d-ae92-94c064b2a908\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.861082 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-inventory\") pod \"7aa88472-106f-484d-ae92-94c064b2a908\" (UID: \"7aa88472-106f-484d-ae92-94c064b2a908\") " Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.879855 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa88472-106f-484d-ae92-94c064b2a908-kube-api-access-mf4rt" (OuterVolumeSpecName: "kube-api-access-mf4rt") pod "7aa88472-106f-484d-ae92-94c064b2a908" (UID: "7aa88472-106f-484d-ae92-94c064b2a908"). InnerVolumeSpecName "kube-api-access-mf4rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.902543 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7aa88472-106f-484d-ae92-94c064b2a908" (UID: "7aa88472-106f-484d-ae92-94c064b2a908"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.917916 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-inventory" (OuterVolumeSpecName: "inventory") pod "7aa88472-106f-484d-ae92-94c064b2a908" (UID: "7aa88472-106f-484d-ae92-94c064b2a908"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.964972 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.965009 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf4rt\" (UniqueName: \"kubernetes.io/projected/7aa88472-106f-484d-ae92-94c064b2a908-kube-api-access-mf4rt\") on node \"crc\" DevicePath \"\"" Dec 02 16:27:56 crc kubenswrapper[4933]: I1202 16:27:56.965024 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa88472-106f-484d-ae92-94c064b2a908-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.212855 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" event={"ID":"7aa88472-106f-484d-ae92-94c064b2a908","Type":"ContainerDied","Data":"7dce71cdb41d09168085ca1f31ff13bd0a4f188d3977b18b066fd81ae3a85a67"} Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.213156 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dce71cdb41d09168085ca1f31ff13bd0a4f188d3977b18b066fd81ae3a85a67" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.212918 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9htq4" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.277504 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w"] Dec 02 16:27:57 crc kubenswrapper[4933]: E1202 16:27:57.278793 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa88472-106f-484d-ae92-94c064b2a908" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.278927 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa88472-106f-484d-ae92-94c064b2a908" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.279712 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa88472-106f-484d-ae92-94c064b2a908" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.281170 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.285147 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.285183 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.285474 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.291119 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.321003 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w"] Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.371379 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.371451 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgjns\" (UniqueName: \"kubernetes.io/projected/213c04c8-e37f-4898-8141-cd8a5a5e6626-kube-api-access-hgjns\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.371698 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.472899 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.472964 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgjns\" (UniqueName: \"kubernetes.io/projected/213c04c8-e37f-4898-8141-cd8a5a5e6626-kube-api-access-hgjns\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.473151 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: 
\"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.477663 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.477663 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.489806 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgjns\" (UniqueName: \"kubernetes.io/projected/213c04c8-e37f-4898-8141-cd8a5a5e6626-kube-api-access-hgjns\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mqv4w\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:57 crc kubenswrapper[4933]: I1202 16:27:57.610890 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:27:58 crc kubenswrapper[4933]: I1202 16:27:58.212332 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w"] Dec 02 16:27:58 crc kubenswrapper[4933]: W1202 16:27:58.216156 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod213c04c8_e37f_4898_8141_cd8a5a5e6626.slice/crio-5640dc0691c9ff635bfae6e2cd665fc0b07e7f1dfa6938f88f5ddc05597ed874 WatchSource:0}: Error finding container 5640dc0691c9ff635bfae6e2cd665fc0b07e7f1dfa6938f88f5ddc05597ed874: Status 404 returned error can't find the container with id 5640dc0691c9ff635bfae6e2cd665fc0b07e7f1dfa6938f88f5ddc05597ed874 Dec 02 16:27:59 crc kubenswrapper[4933]: I1202 16:27:59.239676 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" event={"ID":"213c04c8-e37f-4898-8141-cd8a5a5e6626","Type":"ContainerStarted","Data":"7cc37adbd28fdaad6629e40faa5d7a690683089f6371442fb4ddd98cda70e1cb"} Dec 02 16:27:59 crc kubenswrapper[4933]: I1202 16:27:59.240505 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" event={"ID":"213c04c8-e37f-4898-8141-cd8a5a5e6626","Type":"ContainerStarted","Data":"5640dc0691c9ff635bfae6e2cd665fc0b07e7f1dfa6938f88f5ddc05597ed874"} Dec 02 16:27:59 crc kubenswrapper[4933]: I1202 16:27:59.270393 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" podStartSLOduration=1.8404100159999999 podStartE2EDuration="2.270315327s" podCreationTimestamp="2025-12-02 16:27:57 +0000 UTC" firstStartedPulling="2025-12-02 16:27:58.219162413 +0000 UTC m=+2141.470389126" lastFinishedPulling="2025-12-02 16:27:58.649067734 +0000 UTC m=+2141.900294437" observedRunningTime="2025-12-02 16:27:59.261141371 +0000 UTC 
m=+2142.512368074" watchObservedRunningTime="2025-12-02 16:27:59.270315327 +0000 UTC m=+2142.521542100" Dec 02 16:28:35 crc kubenswrapper[4933]: I1202 16:28:35.068753 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4ntg"] Dec 02 16:28:35 crc kubenswrapper[4933]: I1202 16:28:35.069417 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4ntg"] Dec 02 16:28:37 crc kubenswrapper[4933]: I1202 16:28:37.076767 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63b60f6-5169-44c1-b64f-6bad25c8ba22" path="/var/lib/kubelet/pods/e63b60f6-5169-44c1-b64f-6bad25c8ba22/volumes" Dec 02 16:28:37 crc kubenswrapper[4933]: I1202 16:28:37.659911 4933 generic.go:334] "Generic (PLEG): container finished" podID="213c04c8-e37f-4898-8141-cd8a5a5e6626" containerID="7cc37adbd28fdaad6629e40faa5d7a690683089f6371442fb4ddd98cda70e1cb" exitCode=0 Dec 02 16:28:37 crc kubenswrapper[4933]: I1202 16:28:37.660050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" event={"ID":"213c04c8-e37f-4898-8141-cd8a5a5e6626","Type":"ContainerDied","Data":"7cc37adbd28fdaad6629e40faa5d7a690683089f6371442fb4ddd98cda70e1cb"} Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.180723 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.321778 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-ssh-key\") pod \"213c04c8-e37f-4898-8141-cd8a5a5e6626\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.321942 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgjns\" (UniqueName: \"kubernetes.io/projected/213c04c8-e37f-4898-8141-cd8a5a5e6626-kube-api-access-hgjns\") pod \"213c04c8-e37f-4898-8141-cd8a5a5e6626\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.321965 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-inventory\") pod \"213c04c8-e37f-4898-8141-cd8a5a5e6626\" (UID: \"213c04c8-e37f-4898-8141-cd8a5a5e6626\") " Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.372751 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213c04c8-e37f-4898-8141-cd8a5a5e6626-kube-api-access-hgjns" (OuterVolumeSpecName: "kube-api-access-hgjns") pod "213c04c8-e37f-4898-8141-cd8a5a5e6626" (UID: "213c04c8-e37f-4898-8141-cd8a5a5e6626"). InnerVolumeSpecName "kube-api-access-hgjns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.379085 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "213c04c8-e37f-4898-8141-cd8a5a5e6626" (UID: "213c04c8-e37f-4898-8141-cd8a5a5e6626"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.379847 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-inventory" (OuterVolumeSpecName: "inventory") pod "213c04c8-e37f-4898-8141-cd8a5a5e6626" (UID: "213c04c8-e37f-4898-8141-cd8a5a5e6626"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.424741 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.424793 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgjns\" (UniqueName: \"kubernetes.io/projected/213c04c8-e37f-4898-8141-cd8a5a5e6626-kube-api-access-hgjns\") on node \"crc\" DevicePath \"\"" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.424810 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/213c04c8-e37f-4898-8141-cd8a5a5e6626-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.684220 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" event={"ID":"213c04c8-e37f-4898-8141-cd8a5a5e6626","Type":"ContainerDied","Data":"5640dc0691c9ff635bfae6e2cd665fc0b07e7f1dfa6938f88f5ddc05597ed874"} Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.684270 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5640dc0691c9ff635bfae6e2cd665fc0b07e7f1dfa6938f88f5ddc05597ed874" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.684338 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mqv4w" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.881675 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz"] Dec 02 16:28:39 crc kubenswrapper[4933]: E1202 16:28:39.883001 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213c04c8-e37f-4898-8141-cd8a5a5e6626" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.883028 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="213c04c8-e37f-4898-8141-cd8a5a5e6626" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.884809 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="213c04c8-e37f-4898-8141-cd8a5a5e6626" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.891941 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.894810 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.897750 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.897930 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.898384 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.917327 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz"] Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.984565 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.984754 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:39 crc kubenswrapper[4933]: I1202 16:28:39.984912 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qlgh\" (UniqueName: \"kubernetes.io/projected/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-kube-api-access-7qlgh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.087294 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.087461 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qlgh\" (UniqueName: \"kubernetes.io/projected/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-kube-api-access-7qlgh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.087668 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" 
(UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.092418 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.093100 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.108674 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qlgh\" (UniqueName: \"kubernetes.io/projected/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-kube-api-access-7qlgh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.216617 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:28:40 crc kubenswrapper[4933]: I1202 16:28:40.778487 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz"] Dec 02 16:28:41 crc kubenswrapper[4933]: I1202 16:28:41.714188 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" event={"ID":"9e5c1276-65ce-4553-9d05-e8e27aaef6b3","Type":"ContainerStarted","Data":"f10c299619b89768dcf66b36ca9156b06009045433ddd96c6419a2a7afbdfbc3"} Dec 02 16:28:42 crc kubenswrapper[4933]: I1202 16:28:42.737648 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" event={"ID":"9e5c1276-65ce-4553-9d05-e8e27aaef6b3","Type":"ContainerStarted","Data":"bd4e67548774d436658264cebb57f39b257e92596730b21a8f5774933498f1ef"} Dec 02 16:28:42 crc kubenswrapper[4933]: I1202 16:28:42.759203 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" podStartSLOduration=2.929980199 podStartE2EDuration="3.7591832s" podCreationTimestamp="2025-12-02 16:28:39 +0000 UTC" firstStartedPulling="2025-12-02 16:28:40.782970916 +0000 UTC m=+2184.034197619" lastFinishedPulling="2025-12-02 16:28:41.612173917 +0000 UTC m=+2184.863400620" observedRunningTime="2025-12-02 16:28:42.758335597 +0000 UTC m=+2186.009562320" watchObservedRunningTime="2025-12-02 16:28:42.7591832 +0000 UTC m=+2186.010409903" Dec 02 16:28:45 crc kubenswrapper[4933]: I1202 16:28:45.774699 4933 scope.go:117] "RemoveContainer" containerID="eaa265c2a674d7092efccc8193f3d1a1ca1dd70657192ee9eb954691089f34fa" Dec 02 16:28:45 crc kubenswrapper[4933]: I1202 16:28:45.816159 4933 scope.go:117] "RemoveContainer" containerID="21db6466c97949d3a87805fbfb31c2d56f02e0893d8b2f33c62470d5ebe1fcc5" Dec 02 16:28:45 crc kubenswrapper[4933]: I1202 16:28:45.887085 
4933 scope.go:117] "RemoveContainer" containerID="97790b38e1af16f3cb724c0fa81690697155a68e454d4a7c2c0fcdfdf7c7265f" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.218300 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5m8hc"] Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.222951 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.234602 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5m8hc"] Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.317242 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-utilities\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.317569 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g8zk\" (UniqueName: \"kubernetes.io/projected/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-kube-api-access-2g8zk\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.317669 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-catalog-content\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.419553 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g8zk\" (UniqueName: \"kubernetes.io/projected/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-kube-api-access-2g8zk\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.419736 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-catalog-content\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.419995 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-utilities\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.420507 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-utilities\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.420508 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-catalog-content\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.449999 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g8zk\" (UniqueName: \"kubernetes.io/projected/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-kube-api-access-2g8zk\") pod \"redhat-marketplace-5m8hc\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:49 crc kubenswrapper[4933]: I1202 16:28:49.563106 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:50 crc kubenswrapper[4933]: I1202 16:28:50.149584 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5m8hc"] Dec 02 16:28:50 crc kubenswrapper[4933]: I1202 16:28:50.828613 4933 generic.go:334] "Generic (PLEG): container finished" podID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerID="e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb" exitCode=0 Dec 02 16:28:50 crc kubenswrapper[4933]: I1202 16:28:50.828736 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5m8hc" event={"ID":"35ca6ba4-cef6-4f11-9283-1fa90877c0dd","Type":"ContainerDied","Data":"e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb"} Dec 02 16:28:50 crc kubenswrapper[4933]: I1202 16:28:50.828983 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5m8hc" event={"ID":"35ca6ba4-cef6-4f11-9283-1fa90877c0dd","Type":"ContainerStarted","Data":"c0d782676c7ece4618f14403d993310985347b2d8cff3a13d831a763886ebef9"} Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.022703 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fmxgc"] Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.025973 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.041520 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmxgc"] Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.063548 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-utilities\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.063741 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24kxd\" (UniqueName: \"kubernetes.io/projected/a2d4d73b-5b45-46c7-a537-d0f51fae87af-kube-api-access-24kxd\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.064079 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-catalog-content\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.166218 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-utilities\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.166379 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24kxd\" (UniqueName: \"kubernetes.io/projected/a2d4d73b-5b45-46c7-a537-d0f51fae87af-kube-api-access-24kxd\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.166528 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-catalog-content\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.166889 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-utilities\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.167636 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-catalog-content\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.187712 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-24kxd\" (UniqueName: \"kubernetes.io/projected/a2d4d73b-5b45-46c7-a537-d0f51fae87af-kube-api-access-24kxd\") pod \"certified-operators-fmxgc\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.376428 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:28:51 crc kubenswrapper[4933]: W1202 16:28:51.937219 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d4d73b_5b45_46c7_a537_d0f51fae87af.slice/crio-b66884f0af0d9111591d3e979daea7f4ecee398837f934e77a7c649cd0472a7e WatchSource:0}: Error finding container b66884f0af0d9111591d3e979daea7f4ecee398837f934e77a7c649cd0472a7e: Status 404 returned error can't find the container with id b66884f0af0d9111591d3e979daea7f4ecee398837f934e77a7c649cd0472a7e Dec 02 16:28:51 crc kubenswrapper[4933]: I1202 16:28:51.941489 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmxgc"] Dec 02 16:28:52 crc kubenswrapper[4933]: I1202 16:28:52.871144 4933 generic.go:334] "Generic (PLEG): container finished" podID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerID="d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f" exitCode=0 Dec 02 16:28:52 crc kubenswrapper[4933]: I1202 16:28:52.871270 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5m8hc" event={"ID":"35ca6ba4-cef6-4f11-9283-1fa90877c0dd","Type":"ContainerDied","Data":"d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f"} Dec 02 16:28:52 crc kubenswrapper[4933]: I1202 16:28:52.877007 4933 generic.go:334] "Generic (PLEG): container finished" podID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerID="78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9" exitCode=0 Dec 02 16:28:52 crc kubenswrapper[4933]: I1202 16:28:52.877052 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerDied","Data":"78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9"} Dec 02 16:28:52 crc kubenswrapper[4933]: I1202 16:28:52.877080 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerStarted","Data":"b66884f0af0d9111591d3e979daea7f4ecee398837f934e77a7c649cd0472a7e"} Dec 02 16:28:53 crc kubenswrapper[4933]: I1202 16:28:53.892017 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5m8hc" event={"ID":"35ca6ba4-cef6-4f11-9283-1fa90877c0dd","Type":"ContainerStarted","Data":"05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f"} Dec 02 16:28:53 crc kubenswrapper[4933]: I1202 16:28:53.894357 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerStarted","Data":"f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d"} Dec 02 16:28:53 crc kubenswrapper[4933]: I1202 16:28:53.925433 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5m8hc" podStartSLOduration=2.42272592 
podStartE2EDuration="4.925416355s" podCreationTimestamp="2025-12-02 16:28:49 +0000 UTC" firstStartedPulling="2025-12-02 16:28:50.83068939 +0000 UTC m=+2194.081916103" lastFinishedPulling="2025-12-02 16:28:53.333379835 +0000 UTC m=+2196.584606538" observedRunningTime="2025-12-02 16:28:53.917613435 +0000 UTC m=+2197.168840138" watchObservedRunningTime="2025-12-02 16:28:53.925416355 +0000 UTC m=+2197.176643058" Dec 02 16:28:55 crc kubenswrapper[4933]: I1202 16:28:55.920095 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerDied","Data":"f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d"} Dec 02 16:28:55 crc kubenswrapper[4933]: I1202 16:28:55.919897 4933 generic.go:334] "Generic (PLEG): container finished" podID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerID="f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d" exitCode=0 Dec 02 16:28:57 crc kubenswrapper[4933]: I1202 16:28:57.947503 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerStarted","Data":"59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1"} Dec 02 16:28:57 crc kubenswrapper[4933]: I1202 16:28:57.975698 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fmxgc" podStartSLOduration=4.018665286 podStartE2EDuration="7.975670819s" podCreationTimestamp="2025-12-02 16:28:50 +0000 UTC" firstStartedPulling="2025-12-02 16:28:52.879141933 +0000 UTC m=+2196.130368626" lastFinishedPulling="2025-12-02 16:28:56.836147456 +0000 UTC m=+2200.087374159" observedRunningTime="2025-12-02 16:28:57.962878935 +0000 UTC m=+2201.214105668" watchObservedRunningTime="2025-12-02 16:28:57.975670819 +0000 UTC m=+2201.226897522" Dec 02 16:28:59 crc kubenswrapper[4933]: I1202 16:28:59.563651 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:59 crc kubenswrapper[4933]: I1202 16:28:59.564076 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:28:59 crc kubenswrapper[4933]: I1202 16:28:59.625413 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:29:00 crc kubenswrapper[4933]: I1202 16:29:00.019984 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:29:00 crc kubenswrapper[4933]: I1202 16:29:00.814238 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5m8hc"] Dec 02 16:29:01 crc kubenswrapper[4933]: I1202 16:29:01.377296 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:29:01 crc kubenswrapper[4933]: I1202 16:29:01.377653 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:29:01 crc kubenswrapper[4933]: I1202 16:29:01.431749 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:29:01 crc kubenswrapper[4933]: I1202 16:29:01.993098 4933 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-5m8hc" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="registry-server" containerID="cri-o://05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f" gracePeriod=2 Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.478750 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.591013 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-catalog-content\") pod \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.591186 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g8zk\" (UniqueName: \"kubernetes.io/projected/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-kube-api-access-2g8zk\") pod \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.591274 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-utilities\") pod \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\" (UID: \"35ca6ba4-cef6-4f11-9283-1fa90877c0dd\") " Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.592471 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-utilities" (OuterVolumeSpecName: "utilities") pod "35ca6ba4-cef6-4f11-9283-1fa90877c0dd" (UID: "35ca6ba4-cef6-4f11-9283-1fa90877c0dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.597204 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-kube-api-access-2g8zk" (OuterVolumeSpecName: "kube-api-access-2g8zk") pod "35ca6ba4-cef6-4f11-9283-1fa90877c0dd" (UID: "35ca6ba4-cef6-4f11-9283-1fa90877c0dd"). InnerVolumeSpecName "kube-api-access-2g8zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.611180 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35ca6ba4-cef6-4f11-9283-1fa90877c0dd" (UID: "35ca6ba4-cef6-4f11-9283-1fa90877c0dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.694329 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.694604 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g8zk\" (UniqueName: \"kubernetes.io/projected/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-kube-api-access-2g8zk\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:02 crc kubenswrapper[4933]: I1202 16:29:02.694699 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca6ba4-cef6-4f11-9283-1fa90877c0dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.033626 4933 generic.go:334] "Generic (PLEG): container finished" podID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerID="05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f" exitCode=0 Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.033962 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5m8hc" event={"ID":"35ca6ba4-cef6-4f11-9283-1fa90877c0dd","Type":"ContainerDied","Data":"05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f"} Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.034007 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5m8hc" event={"ID":"35ca6ba4-cef6-4f11-9283-1fa90877c0dd","Type":"ContainerDied","Data":"c0d782676c7ece4618f14403d993310985347b2d8cff3a13d831a763886ebef9"} Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.034026 4933 scope.go:117] "RemoveContainer" containerID="05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.035267 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5m8hc" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.113674 4933 scope.go:117] "RemoveContainer" containerID="d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.169616 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5m8hc"] Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.195616 4933 scope.go:117] "RemoveContainer" containerID="e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.199140 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5m8hc"] Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.238233 4933 scope.go:117] "RemoveContainer" containerID="05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f" Dec 02 16:29:03 crc kubenswrapper[4933]: E1202 16:29:03.244739 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f\": container with ID starting with 05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f not found: ID does not exist" containerID="05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.244841 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f"} err="failed to get container status \"05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f\": rpc error: code = NotFound desc = could not find container \"05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f\": container with ID starting with 05eac890384af6a27f9633d4d405b59e2b31ff3647dc435621642c0d6352d70f not found: ID does not exist" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.244879 4933 scope.go:117] "RemoveContainer" containerID="d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f" Dec 02 16:29:03 crc kubenswrapper[4933]: E1202 16:29:03.245248 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f\": container with ID starting with d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f not found: ID does not exist" containerID="d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.245308 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f"} err="failed to get container status \"d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f\": rpc error: code = NotFound desc = could not find container \"d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f\": container with ID starting with d8b5d59b32ec481caa1d08d663d42cc70d6c0197083b5ef5bb033abf6d9f297f not found: ID does not exist" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.245328 4933 scope.go:117] "RemoveContainer" containerID="e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb" Dec 02 16:29:03 crc kubenswrapper[4933]: E1202 16:29:03.247762 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb\": container with ID starting with e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb not found: ID does not exist" containerID="e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb" Dec 02 16:29:03 crc kubenswrapper[4933]: I1202 16:29:03.247816 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb"} err="failed to get container status \"e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb\": rpc error: code = NotFound desc = could not find container \"e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb\": container with ID starting with e6081ad2f2538bd3ea85e485e1ca5efae28ce13e04388094d4c92e35265832cb not found: ID does not exist" Dec 02 16:29:05 crc kubenswrapper[4933]: I1202 16:29:05.076210 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" path="/var/lib/kubelet/pods/35ca6ba4-cef6-4f11-9283-1fa90877c0dd/volumes" Dec 02 16:29:11 crc kubenswrapper[4933]: I1202 16:29:11.431417 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:29:11 crc kubenswrapper[4933]: I1202 16:29:11.477864 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fmxgc"] Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.160704 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fmxgc" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="registry-server" containerID="cri-o://59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1" gracePeriod=2 Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.664991 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.695795 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-utilities\") pod \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.696143 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24kxd\" (UniqueName: \"kubernetes.io/projected/a2d4d73b-5b45-46c7-a537-d0f51fae87af-kube-api-access-24kxd\") pod \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.696305 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-catalog-content\") pod \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\" (UID: \"a2d4d73b-5b45-46c7-a537-d0f51fae87af\") " Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.697384 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-utilities" (OuterVolumeSpecName: "utilities") pod "a2d4d73b-5b45-46c7-a537-d0f51fae87af" (UID: "a2d4d73b-5b45-46c7-a537-d0f51fae87af"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.710282 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d4d73b-5b45-46c7-a537-d0f51fae87af-kube-api-access-24kxd" (OuterVolumeSpecName: "kube-api-access-24kxd") pod "a2d4d73b-5b45-46c7-a537-d0f51fae87af" (UID: "a2d4d73b-5b45-46c7-a537-d0f51fae87af"). InnerVolumeSpecName "kube-api-access-24kxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.760884 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2d4d73b-5b45-46c7-a537-d0f51fae87af" (UID: "a2d4d73b-5b45-46c7-a537-d0f51fae87af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.799068 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24kxd\" (UniqueName: \"kubernetes.io/projected/a2d4d73b-5b45-46c7-a537-d0f51fae87af-kube-api-access-24kxd\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.799329 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:12 crc kubenswrapper[4933]: I1202 16:29:12.799404 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2d4d73b-5b45-46c7-a537-d0f51fae87af-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.174109 4933 generic.go:334] "Generic (PLEG): container finished" podID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerID="59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1" exitCode=0 Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.174186 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmxgc" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.174229 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerDied","Data":"59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1"} Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.174505 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmxgc" event={"ID":"a2d4d73b-5b45-46c7-a537-d0f51fae87af","Type":"ContainerDied","Data":"b66884f0af0d9111591d3e979daea7f4ecee398837f934e77a7c649cd0472a7e"} Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.174529 4933 scope.go:117] "RemoveContainer" containerID="59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.201498 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fmxgc"] Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.211734 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fmxgc"] Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.224499 4933 scope.go:117] "RemoveContainer" containerID="f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.253265 4933 scope.go:117] "RemoveContainer" containerID="78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.303275 4933 scope.go:117] "RemoveContainer" containerID="59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1" Dec 02 16:29:13 crc kubenswrapper[4933]: E1202 16:29:13.303715 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1\": container with ID starting with 59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1 not found: ID does not exist" containerID="59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.303748 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1"} err="failed to get container status \"59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1\": rpc error: code = NotFound desc = could not find container \"59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1\": container with ID starting with 59acc8dae3ce9631366ae4cb5320329a37cbaea25cee892aadde09d6cd05f4f1 not found: ID does not exist" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.303769 4933 scope.go:117] "RemoveContainer" containerID="f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d" Dec 02 16:29:13 crc kubenswrapper[4933]: E1202 16:29:13.304176 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d\": container with ID starting with f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d not found: ID does not exist" containerID="f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.304227 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d"} err="failed to get container status \"f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d\": rpc error: code = NotFound desc = could not find container \"f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d\": container with ID starting with f352f9537a88e014c8cd265b43cc3ffd4954944c484672bfebd631110e4ffa4d not found: ID does not exist" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.304261 4933 scope.go:117] "RemoveContainer" containerID="78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9" Dec 02 16:29:13 crc kubenswrapper[4933]: E1202 16:29:13.304737 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9\": container with ID starting with 78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9 not found: ID does not exist" containerID="78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9" Dec 02 16:29:13 crc kubenswrapper[4933]: I1202 16:29:13.304781 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9"} err="failed to get container status \"78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9\": rpc error: code = NotFound desc = could not find container \"78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9\": container with ID starting with 78a5819afa7d63eee93ffa840156b2d7157202d34624e854d0815fefea9fe3f9 not found: ID does not exist" Dec 02 16:29:15 crc kubenswrapper[4933]: I1202 16:29:15.065680 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" path="/var/lib/kubelet/pods/a2d4d73b-5b45-46c7-a537-d0f51fae87af/volumes" Dec 02 16:29:17 crc kubenswrapper[4933]: I1202 16:29:17.169237 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:29:17 crc kubenswrapper[4933]: I1202 16:29:17.169732 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:29:32 crc kubenswrapper[4933]: I1202 16:29:32.382069 4933 generic.go:334] "Generic (PLEG): container finished" podID="9e5c1276-65ce-4553-9d05-e8e27aaef6b3" containerID="bd4e67548774d436658264cebb57f39b257e92596730b21a8f5774933498f1ef" exitCode=0 Dec 02 16:29:32 crc kubenswrapper[4933]: I1202 16:29:32.382177 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" event={"ID":"9e5c1276-65ce-4553-9d05-e8e27aaef6b3","Type":"ContainerDied","Data":"bd4e67548774d436658264cebb57f39b257e92596730b21a8f5774933498f1ef"} Dec 02 16:29:33 crc kubenswrapper[4933]: I1202 16:29:33.880250 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.015709 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-inventory\") pod \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.015932 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qlgh\" (UniqueName: \"kubernetes.io/projected/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-kube-api-access-7qlgh\") pod \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.016021 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-ssh-key\") pod \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\" (UID: \"9e5c1276-65ce-4553-9d05-e8e27aaef6b3\") " Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.081075 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-kube-api-access-7qlgh" (OuterVolumeSpecName: "kube-api-access-7qlgh") pod "9e5c1276-65ce-4553-9d05-e8e27aaef6b3" (UID: "9e5c1276-65ce-4553-9d05-e8e27aaef6b3"). InnerVolumeSpecName "kube-api-access-7qlgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.093030 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-inventory" (OuterVolumeSpecName: "inventory") pod "9e5c1276-65ce-4553-9d05-e8e27aaef6b3" (UID: "9e5c1276-65ce-4553-9d05-e8e27aaef6b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.113028 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9e5c1276-65ce-4553-9d05-e8e27aaef6b3" (UID: "9e5c1276-65ce-4553-9d05-e8e27aaef6b3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.122061 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.122100 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.122113 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qlgh\" (UniqueName: \"kubernetes.io/projected/9e5c1276-65ce-4553-9d05-e8e27aaef6b3-kube-api-access-7qlgh\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.407150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" event={"ID":"9e5c1276-65ce-4553-9d05-e8e27aaef6b3","Type":"ContainerDied","Data":"f10c299619b89768dcf66b36ca9156b06009045433ddd96c6419a2a7afbdfbc3"} Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.407204 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f10c299619b89768dcf66b36ca9156b06009045433ddd96c6419a2a7afbdfbc3" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.407274 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.501687 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4m7pz"] Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502311 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="extract-utilities" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502333 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="extract-utilities" Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502349 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="extract-utilities" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502355 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="extract-utilities" Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502373 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="registry-server" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502381 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="registry-server" Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502395 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="extract-content" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502401 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="extract-content" Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502414 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" 
containerName="extract-content" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502420 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="extract-content" Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502436 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="registry-server" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502442 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="registry-server" Dec 02 16:29:34 crc kubenswrapper[4933]: E1202 16:29:34.502451 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5c1276-65ce-4553-9d05-e8e27aaef6b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502460 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5c1276-65ce-4553-9d05-e8e27aaef6b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502684 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5c1276-65ce-4553-9d05-e8e27aaef6b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502700 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ca6ba4-cef6-4f11-9283-1fa90877c0dd" containerName="registry-server" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.502711 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d4d73b-5b45-46c7-a537-d0f51fae87af" containerName="registry-server" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.503626 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.506060 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.506116 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.506201 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.506314 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.519870 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4m7pz"] Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.632986 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.633423 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.633739 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcfxs\" (UniqueName: \"kubernetes.io/projected/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-kube-api-access-zcfxs\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.736345 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcfxs\" (UniqueName: \"kubernetes.io/projected/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-kube-api-access-zcfxs\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.736443 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.736556 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc 
kubenswrapper[4933]: I1202 16:29:34.740615 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.744696 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.753206 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcfxs\" (UniqueName: \"kubernetes.io/projected/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-kube-api-access-zcfxs\") pod \"ssh-known-hosts-edpm-deployment-4m7pz\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:34 crc kubenswrapper[4933]: I1202 16:29:34.821083 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:35 crc kubenswrapper[4933]: I1202 16:29:35.394378 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4m7pz"] Dec 02 16:29:35 crc kubenswrapper[4933]: I1202 16:29:35.403043 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:29:35 crc kubenswrapper[4933]: I1202 16:29:35.436884 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" event={"ID":"8b53d0b1-3511-4e0c-9d87-b7ceab39da16","Type":"ContainerStarted","Data":"7382ad7cabcbbe188224e424e2d7918f3fb1bab2dc4e8418b3b84b8971259f7a"} Dec 02 16:29:36 crc kubenswrapper[4933]: I1202 16:29:36.448259 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" event={"ID":"8b53d0b1-3511-4e0c-9d87-b7ceab39da16","Type":"ContainerStarted","Data":"e95428e3a6d6c09b953590ee330d82d635e2ad318c4c7e6a0d3299e5a80f4210"} Dec 02 16:29:36 crc kubenswrapper[4933]: I1202 16:29:36.472229 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" podStartSLOduration=2.047044458 podStartE2EDuration="2.472207671s" podCreationTimestamp="2025-12-02 16:29:34 +0000 UTC" firstStartedPulling="2025-12-02 16:29:35.402808449 +0000 UTC m=+2238.654035152" lastFinishedPulling="2025-12-02 16:29:35.827971652 +0000 UTC m=+2239.079198365" observedRunningTime="2025-12-02 16:29:36.4624886 +0000 UTC m=+2239.713715293" watchObservedRunningTime="2025-12-02 16:29:36.472207671 +0000 UTC m=+2239.723434374" Dec 02 16:29:43 crc kubenswrapper[4933]: I1202 16:29:43.523838 4933 generic.go:334] "Generic (PLEG): container finished" podID="8b53d0b1-3511-4e0c-9d87-b7ceab39da16" containerID="e95428e3a6d6c09b953590ee330d82d635e2ad318c4c7e6a0d3299e5a80f4210" exitCode=0 Dec 02 16:29:43 crc kubenswrapper[4933]: I1202 16:29:43.523973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" 
event={"ID":"8b53d0b1-3511-4e0c-9d87-b7ceab39da16","Type":"ContainerDied","Data":"e95428e3a6d6c09b953590ee330d82d635e2ad318c4c7e6a0d3299e5a80f4210"} Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.019437 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.084292 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-ssh-key-openstack-edpm-ipam\") pod \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.084919 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-inventory-0\") pod \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.085043 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcfxs\" (UniqueName: \"kubernetes.io/projected/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-kube-api-access-zcfxs\") pod \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\" (UID: \"8b53d0b1-3511-4e0c-9d87-b7ceab39da16\") " Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.120025 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-kube-api-access-zcfxs" (OuterVolumeSpecName: "kube-api-access-zcfxs") pod "8b53d0b1-3511-4e0c-9d87-b7ceab39da16" (UID: "8b53d0b1-3511-4e0c-9d87-b7ceab39da16"). InnerVolumeSpecName "kube-api-access-zcfxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.190007 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcfxs\" (UniqueName: \"kubernetes.io/projected/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-kube-api-access-zcfxs\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.245745 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b53d0b1-3511-4e0c-9d87-b7ceab39da16" (UID: "8b53d0b1-3511-4e0c-9d87-b7ceab39da16"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.270316 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8b53d0b1-3511-4e0c-9d87-b7ceab39da16" (UID: "8b53d0b1-3511-4e0c-9d87-b7ceab39da16"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.291863 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.291906 4933 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b53d0b1-3511-4e0c-9d87-b7ceab39da16-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.544424 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" event={"ID":"8b53d0b1-3511-4e0c-9d87-b7ceab39da16","Type":"ContainerDied","Data":"7382ad7cabcbbe188224e424e2d7918f3fb1bab2dc4e8418b3b84b8971259f7a"} Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.544472 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7382ad7cabcbbe188224e424e2d7918f3fb1bab2dc4e8418b3b84b8971259f7a" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.544487 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4m7pz" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.636522 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb"] Dec 02 16:29:45 crc kubenswrapper[4933]: E1202 16:29:45.637067 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b53d0b1-3511-4e0c-9d87-b7ceab39da16" containerName="ssh-known-hosts-edpm-deployment" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.637086 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b53d0b1-3511-4e0c-9d87-b7ceab39da16" containerName="ssh-known-hosts-edpm-deployment" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.637327 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b53d0b1-3511-4e0c-9d87-b7ceab39da16" containerName="ssh-known-hosts-edpm-deployment" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.638355 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.641595 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.641744 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.642261 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.645417 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.653768 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb"] Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.802583 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.802751 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59258\" (UniqueName: \"kubernetes.io/projected/0feefe80-99a0-4d78-9753-c823a10fc0f8-kube-api-access-59258\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.803756 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.906002 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.906424 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.906532 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59258\" (UniqueName: \"kubernetes.io/projected/0feefe80-99a0-4d78-9753-c823a10fc0f8-kube-api-access-59258\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.911022 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.912199 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:45 crc kubenswrapper[4933]: I1202 16:29:45.926671 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59258\" (UniqueName: \"kubernetes.io/projected/0feefe80-99a0-4d78-9753-c823a10fc0f8-kube-api-access-59258\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvgb\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:46 crc kubenswrapper[4933]: I1202 16:29:46.002643 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:46 crc kubenswrapper[4933]: I1202 16:29:46.588753 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb"] Dec 02 16:29:46 crc kubenswrapper[4933]: W1202 16:29:46.593383 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0feefe80_99a0_4d78_9753_c823a10fc0f8.slice/crio-07aa60630c4132df63c6c1655fe9ada621b021e7a025bd7d02f6a5dd279d4919 WatchSource:0}: Error finding container 07aa60630c4132df63c6c1655fe9ada621b021e7a025bd7d02f6a5dd279d4919: Status 404 returned error can't find the container with id 07aa60630c4132df63c6c1655fe9ada621b021e7a025bd7d02f6a5dd279d4919 Dec 02 16:29:47 crc kubenswrapper[4933]: I1202 16:29:47.169327 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:29:47 crc kubenswrapper[4933]: I1202 16:29:47.169698 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:29:47 crc kubenswrapper[4933]: I1202 16:29:47.569961 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" event={"ID":"0feefe80-99a0-4d78-9753-c823a10fc0f8","Type":"ContainerStarted","Data":"85ea5ae177f360ae0832667eb2fec93c281a28e4b426f35ba84dd16e71416119"} Dec 02 16:29:47 crc kubenswrapper[4933]: I1202 16:29:47.570319 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" 
event={"ID":"0feefe80-99a0-4d78-9753-c823a10fc0f8","Type":"ContainerStarted","Data":"07aa60630c4132df63c6c1655fe9ada621b021e7a025bd7d02f6a5dd279d4919"} Dec 02 16:29:47 crc kubenswrapper[4933]: I1202 16:29:47.590643 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" podStartSLOduration=2.14938323 podStartE2EDuration="2.590625913s" podCreationTimestamp="2025-12-02 16:29:45 +0000 UTC" firstStartedPulling="2025-12-02 16:29:46.595561675 +0000 UTC m=+2249.846788388" lastFinishedPulling="2025-12-02 16:29:47.036804368 +0000 UTC m=+2250.288031071" observedRunningTime="2025-12-02 16:29:47.585919437 +0000 UTC m=+2250.837146150" watchObservedRunningTime="2025-12-02 16:29:47.590625913 +0000 UTC m=+2250.841852626" Dec 02 16:29:55 crc kubenswrapper[4933]: I1202 16:29:55.680084 4933 generic.go:334] "Generic (PLEG): container finished" podID="0feefe80-99a0-4d78-9753-c823a10fc0f8" containerID="85ea5ae177f360ae0832667eb2fec93c281a28e4b426f35ba84dd16e71416119" exitCode=0 Dec 02 16:29:55 crc kubenswrapper[4933]: I1202 16:29:55.680165 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" event={"ID":"0feefe80-99a0-4d78-9753-c823a10fc0f8","Type":"ContainerDied","Data":"85ea5ae177f360ae0832667eb2fec93c281a28e4b426f35ba84dd16e71416119"} Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.181656 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.307963 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-ssh-key\") pod \"0feefe80-99a0-4d78-9753-c823a10fc0f8\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.308023 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-inventory\") pod \"0feefe80-99a0-4d78-9753-c823a10fc0f8\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.308159 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59258\" (UniqueName: \"kubernetes.io/projected/0feefe80-99a0-4d78-9753-c823a10fc0f8-kube-api-access-59258\") pod \"0feefe80-99a0-4d78-9753-c823a10fc0f8\" (UID: \"0feefe80-99a0-4d78-9753-c823a10fc0f8\") " Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.320080 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0feefe80-99a0-4d78-9753-c823a10fc0f8-kube-api-access-59258" (OuterVolumeSpecName: "kube-api-access-59258") pod "0feefe80-99a0-4d78-9753-c823a10fc0f8" (UID: "0feefe80-99a0-4d78-9753-c823a10fc0f8"). InnerVolumeSpecName "kube-api-access-59258". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.344878 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-inventory" (OuterVolumeSpecName: "inventory") pod "0feefe80-99a0-4d78-9753-c823a10fc0f8" (UID: "0feefe80-99a0-4d78-9753-c823a10fc0f8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.347095 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0feefe80-99a0-4d78-9753-c823a10fc0f8" (UID: "0feefe80-99a0-4d78-9753-c823a10fc0f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.411687 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.411738 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0feefe80-99a0-4d78-9753-c823a10fc0f8-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.411754 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59258\" (UniqueName: \"kubernetes.io/projected/0feefe80-99a0-4d78-9753-c823a10fc0f8-kube-api-access-59258\") on node \"crc\" DevicePath \"\"" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.727658 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" event={"ID":"0feefe80-99a0-4d78-9753-c823a10fc0f8","Type":"ContainerDied","Data":"07aa60630c4132df63c6c1655fe9ada621b021e7a025bd7d02f6a5dd279d4919"} Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.728044 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07aa60630c4132df63c6c1655fe9ada621b021e7a025bd7d02f6a5dd279d4919" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.727724 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvgb" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.814334 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb"] Dec 02 16:29:57 crc kubenswrapper[4933]: E1202 16:29:57.815005 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0feefe80-99a0-4d78-9753-c823a10fc0f8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.815047 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0feefe80-99a0-4d78-9753-c823a10fc0f8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.815390 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0feefe80-99a0-4d78-9753-c823a10fc0f8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.816371 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.818818 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.819262 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.823103 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.824337 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.825573 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb"] Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.923901 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hqc\" (UniqueName: \"kubernetes.io/projected/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-kube-api-access-22hqc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.923968 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:57 crc kubenswrapper[4933]: I1202 16:29:57.924279 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.026800 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22hqc\" (UniqueName: \"kubernetes.io/projected/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-kube-api-access-22hqc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.026895 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.027003 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: 
\"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.032229 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.032429 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.045786 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hqc\" (UniqueName: \"kubernetes.io/projected/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-kube-api-access-22hqc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.136493 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.698036 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb"] Dec 02 16:29:58 crc kubenswrapper[4933]: W1202 16:29:58.709762 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2b6f713_7e70_4d1e_9ee4_5fa433a01ded.slice/crio-0daa639f6b47725eccb012d673109e5fad5718c75852bd0ce3d743eb27283125 WatchSource:0}: Error finding container 0daa639f6b47725eccb012d673109e5fad5718c75852bd0ce3d743eb27283125: Status 404 returned error can't find the container with id 0daa639f6b47725eccb012d673109e5fad5718c75852bd0ce3d743eb27283125 Dec 02 16:29:58 crc kubenswrapper[4933]: I1202 16:29:58.740555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" event={"ID":"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded","Type":"ContainerStarted","Data":"0daa639f6b47725eccb012d673109e5fad5718c75852bd0ce3d743eb27283125"} Dec 02 16:29:59 crc kubenswrapper[4933]: I1202 16:29:59.751929 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" event={"ID":"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded","Type":"ContainerStarted","Data":"96052bd2a9cfad1fedd03cb09398132a69a96815167843573d761959cc3c9ac6"} Dec 02 16:29:59 crc kubenswrapper[4933]: I1202 16:29:59.766723 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" podStartSLOduration=2.217956935 podStartE2EDuration="2.766702463s" podCreationTimestamp="2025-12-02 16:29:57 +0000 UTC" firstStartedPulling="2025-12-02 16:29:58.71332231 +0000 UTC m=+2261.964549003" lastFinishedPulling="2025-12-02 16:29:59.262067828 +0000 UTC m=+2262.513294531" observedRunningTime="2025-12-02 16:29:59.763785265 +0000 UTC m=+2263.015011978" 
watchObservedRunningTime="2025-12-02 16:29:59.766702463 +0000 UTC m=+2263.017929166" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.145567 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5"] Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.147804 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.150242 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.150665 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.157841 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5"] Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.284996 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699448fc-4c20-4353-b844-e6beec90e5ac-secret-volume\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.285036 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699448fc-4c20-4353-b844-e6beec90e5ac-config-volume\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.285109 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65j2\" (UniqueName: \"kubernetes.io/projected/699448fc-4c20-4353-b844-e6beec90e5ac-kube-api-access-h65j2\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.387232 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699448fc-4c20-4353-b844-e6beec90e5ac-secret-volume\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.387292 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699448fc-4c20-4353-b844-e6beec90e5ac-config-volume\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.387379 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65j2\" (UniqueName: \"kubernetes.io/projected/699448fc-4c20-4353-b844-e6beec90e5ac-kube-api-access-h65j2\") pod 
\"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.388327 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699448fc-4c20-4353-b844-e6beec90e5ac-config-volume\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.393045 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699448fc-4c20-4353-b844-e6beec90e5ac-secret-volume\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.405184 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65j2\" (UniqueName: \"kubernetes.io/projected/699448fc-4c20-4353-b844-e6beec90e5ac-kube-api-access-h65j2\") pod \"collect-profiles-29411550-2x6q5\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.488630 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:00 crc kubenswrapper[4933]: I1202 16:30:00.972218 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5"] Dec 02 16:30:00 crc kubenswrapper[4933]: W1202 16:30:00.976206 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699448fc_4c20_4353_b844_e6beec90e5ac.slice/crio-7ba47d8d433e57b7f62bfbebdd1dca321840a8466ee6a4f04eca7c000d637cbb WatchSource:0}: Error finding container 7ba47d8d433e57b7f62bfbebdd1dca321840a8466ee6a4f04eca7c000d637cbb: Status 404 returned error can't find the container with id 7ba47d8d433e57b7f62bfbebdd1dca321840a8466ee6a4f04eca7c000d637cbb Dec 02 16:30:01 crc kubenswrapper[4933]: I1202 16:30:01.776123 4933 generic.go:334] "Generic (PLEG): container finished" podID="699448fc-4c20-4353-b844-e6beec90e5ac" containerID="8bbbe127965e9e18276dab3fc4eafefa87a3b997f5ce8c03b3ec5d398be43c05" exitCode=0 Dec 02 16:30:01 crc kubenswrapper[4933]: I1202 16:30:01.776161 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" event={"ID":"699448fc-4c20-4353-b844-e6beec90e5ac","Type":"ContainerDied","Data":"8bbbe127965e9e18276dab3fc4eafefa87a3b997f5ce8c03b3ec5d398be43c05"} Dec 02 16:30:01 crc kubenswrapper[4933]: I1202 16:30:01.776692 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" event={"ID":"699448fc-4c20-4353-b844-e6beec90e5ac","Type":"ContainerStarted","Data":"7ba47d8d433e57b7f62bfbebdd1dca321840a8466ee6a4f04eca7c000d637cbb"} Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.257755 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.366466 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h65j2\" (UniqueName: \"kubernetes.io/projected/699448fc-4c20-4353-b844-e6beec90e5ac-kube-api-access-h65j2\") pod \"699448fc-4c20-4353-b844-e6beec90e5ac\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.366808 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699448fc-4c20-4353-b844-e6beec90e5ac-secret-volume\") pod \"699448fc-4c20-4353-b844-e6beec90e5ac\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.366902 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699448fc-4c20-4353-b844-e6beec90e5ac-config-volume\") pod \"699448fc-4c20-4353-b844-e6beec90e5ac\" (UID: \"699448fc-4c20-4353-b844-e6beec90e5ac\") " Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.368092 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699448fc-4c20-4353-b844-e6beec90e5ac-config-volume" (OuterVolumeSpecName: "config-volume") pod "699448fc-4c20-4353-b844-e6beec90e5ac" (UID: "699448fc-4c20-4353-b844-e6beec90e5ac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.368862 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699448fc-4c20-4353-b844-e6beec90e5ac-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.379621 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699448fc-4c20-4353-b844-e6beec90e5ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "699448fc-4c20-4353-b844-e6beec90e5ac" (UID: "699448fc-4c20-4353-b844-e6beec90e5ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.382116 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699448fc-4c20-4353-b844-e6beec90e5ac-kube-api-access-h65j2" (OuterVolumeSpecName: "kube-api-access-h65j2") pod "699448fc-4c20-4353-b844-e6beec90e5ac" (UID: "699448fc-4c20-4353-b844-e6beec90e5ac"). InnerVolumeSpecName "kube-api-access-h65j2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.472929 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699448fc-4c20-4353-b844-e6beec90e5ac-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.473053 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h65j2\" (UniqueName: \"kubernetes.io/projected/699448fc-4c20-4353-b844-e6beec90e5ac-kube-api-access-h65j2\") on node \"crc\" DevicePath \"\"" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.814593 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" event={"ID":"699448fc-4c20-4353-b844-e6beec90e5ac","Type":"ContainerDied","Data":"7ba47d8d433e57b7f62bfbebdd1dca321840a8466ee6a4f04eca7c000d637cbb"} Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.814954 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ba47d8d433e57b7f62bfbebdd1dca321840a8466ee6a4f04eca7c000d637cbb" Dec 02 16:30:03 crc kubenswrapper[4933]: I1202 16:30:03.814651 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5" Dec 02 16:30:04 crc kubenswrapper[4933]: I1202 16:30:04.342042 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"] Dec 02 16:30:04 crc kubenswrapper[4933]: I1202 16:30:04.354887 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411505-4m9wv"] Dec 02 16:30:05 crc kubenswrapper[4933]: I1202 16:30:05.068736 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c9d818-4012-423f-b3ce-bec8ac30f1d7" path="/var/lib/kubelet/pods/11c9d818-4012-423f-b3ce-bec8ac30f1d7/volumes" Dec 02 16:30:09 crc kubenswrapper[4933]: I1202 16:30:09.878100 4933 generic.go:334] "Generic (PLEG): container finished" podID="b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" containerID="96052bd2a9cfad1fedd03cb09398132a69a96815167843573d761959cc3c9ac6" exitCode=0 Dec 02 16:30:09 crc kubenswrapper[4933]: I1202 16:30:09.878167 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" event={"ID":"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded","Type":"ContainerDied","Data":"96052bd2a9cfad1fedd03cb09398132a69a96815167843573d761959cc3c9ac6"} Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.357938 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb"
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.476030 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22hqc\" (UniqueName: \"kubernetes.io/projected/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-kube-api-access-22hqc\") pod \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") "
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.476217 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-ssh-key\") pod \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") "
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.476327 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-inventory\") pod \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\" (UID: \"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded\") "
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.494554 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-kube-api-access-22hqc" (OuterVolumeSpecName: "kube-api-access-22hqc") pod "b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" (UID: "b2b6f713-7e70-4d1e-9ee4-5fa433a01ded"). InnerVolumeSpecName "kube-api-access-22hqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.511426 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" (UID: "b2b6f713-7e70-4d1e-9ee4-5fa433a01ded"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.516832 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-inventory" (OuterVolumeSpecName: "inventory") pod "b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" (UID: "b2b6f713-7e70-4d1e-9ee4-5fa433a01ded"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.579240 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.579286 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.579300 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22hqc\" (UniqueName: \"kubernetes.io/projected/b2b6f713-7e70-4d1e-9ee4-5fa433a01ded-kube-api-access-22hqc\") on node \"crc\" DevicePath \"\""
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.900464 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb" event={"ID":"b2b6f713-7e70-4d1e-9ee4-5fa433a01ded","Type":"ContainerDied","Data":"0daa639f6b47725eccb012d673109e5fad5718c75852bd0ce3d743eb27283125"}
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.900520 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0daa639f6b47725eccb012d673109e5fad5718c75852bd0ce3d743eb27283125"
Dec 02 16:30:11 crc kubenswrapper[4933]: I1202 16:30:11.900550 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.027519 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"]
Dec 02 16:30:12 crc kubenswrapper[4933]: E1202 16:30:12.028173 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.028201 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:30:12 crc kubenswrapper[4933]: E1202 16:30:12.028249 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699448fc-4c20-4353-b844-e6beec90e5ac" containerName="collect-profiles"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.028258 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="699448fc-4c20-4353-b844-e6beec90e5ac" containerName="collect-profiles"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.028537 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="699448fc-4c20-4353-b844-e6beec90e5ac" containerName="collect-profiles"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.028558 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b6f713-7e70-4d1e-9ee4-5fa433a01ded" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.029627 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.032466 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.032714 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.032927 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.032956 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.033206 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.033334 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.033531 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.033714 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.033870 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.050003 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"]
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.090315 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.090652 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.090811 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.090958 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091091 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091209 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091276 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091343 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091390 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091416 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091484 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091510 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091562 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091583 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091601 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.091628 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c7mq\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-kube-api-access-4c7mq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.193666 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.193774 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.193809 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194082 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194130 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194427 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194469 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194497 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194533 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c7mq\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-kube-api-access-4c7mq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194574 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194677 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194728 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194780 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194838 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194881 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.194921 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.200813 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.201934 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.202100 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.204546 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.205210 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.205415 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.205506 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.205554 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.205561 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.205810 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.208108 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.208480 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.209279 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.209531 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.211986 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.217278 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c7mq\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-kube-api-access-4c7mq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-87krj\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.349042 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:30:12 crc kubenswrapper[4933]: I1202 16:30:12.912417 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"]
Dec 02 16:30:13 crc kubenswrapper[4933]: I1202 16:30:13.921417 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj" event={"ID":"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc","Type":"ContainerStarted","Data":"cf7caa0d8be0be701fbcee782f2bc690c029f671f890bd4b17f2378101b1c315"}
Dec 02 16:30:13 crc kubenswrapper[4933]: I1202 16:30:13.921714 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj" event={"ID":"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc","Type":"ContainerStarted","Data":"14ca1c2406b04e878ce8e76c8119d9a4a5cfaf3a2acdd4fb812314b25a30ef86"}
Dec 02 16:30:13 crc kubenswrapper[4933]: I1202 16:30:13.949625 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj" podStartSLOduration=1.385982682 podStartE2EDuration="1.949608189s" podCreationTimestamp="2025-12-02 16:30:12 +0000 UTC" firstStartedPulling="2025-12-02 16:30:12.908084264 +0000 UTC m=+2276.159310967" lastFinishedPulling="2025-12-02 16:30:13.471709771 +0000 UTC m=+2276.722936474" observedRunningTime="2025-12-02 16:30:13.939267572 +0000 UTC m=+2277.190494275" watchObservedRunningTime="2025-12-02 16:30:13.949608189 +0000 UTC m=+2277.200834892"
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.169371 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.170009 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.170056 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.171037 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.171094 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" gracePeriod=600
Dec 02 16:30:17 crc kubenswrapper[4933]: E1202 16:30:17.595220 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.965415 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" exitCode=0
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.965460 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"}
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.965497 4933 scope.go:117] "RemoveContainer" containerID="620bdd0c4263af7415f957bb835c5b4382baf2a25794824589647a73b3d410f0"
Dec 02 16:30:17 crc kubenswrapper[4933]: I1202 16:30:17.966359 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"
Dec 02 16:30:17 crc kubenswrapper[4933]: E1202 16:30:17.966740 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:30:30 crc kubenswrapper[4933]: I1202 16:30:30.054260 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"
Dec 02 16:30:30 crc kubenswrapper[4933]: E1202 16:30:30.055104 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:30:31 crc kubenswrapper[4933]: I1202 16:30:31.070411 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-747pm"]
Dec 02 16:30:31 crc kubenswrapper[4933]: I1202 16:30:31.080803 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-747pm"]
Dec 02 16:30:33 crc kubenswrapper[4933]: I1202 16:30:33.065336 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ee251f-65d2-476a-ab6d-3fc838cbdf55" path="/var/lib/kubelet/pods/70ee251f-65d2-476a-ab6d-3fc838cbdf55/volumes"
Dec 02 16:30:44 crc kubenswrapper[4933]: I1202 16:30:44.053491 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"
Dec 02 16:30:44 crc kubenswrapper[4933]: E1202 16:30:44.054258 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:30:46 crc kubenswrapper[4933]: I1202 16:30:46.099398 4933 scope.go:117] "RemoveContainer" containerID="30c530fce0541189af024a3f9587d8114687b666d0ff8684af85e2de71b36f90"
Dec 02 16:30:46 crc kubenswrapper[4933]: I1202 16:30:46.141732 4933 scope.go:117] "RemoveContainer" containerID="ff5b687922c855aa07c54509f6bbccb820b172d2feaa3e082b62648bebf45765"
Dec 02 16:30:59 crc kubenswrapper[4933]: I1202 16:30:59.054864 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"
Dec 02 16:30:59 crc kubenswrapper[4933]: E1202 16:30:59.055958 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:31:00 crc kubenswrapper[4933]: I1202 16:31:00.487060 4933 generic.go:334] "Generic (PLEG): container finished" podID="6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" containerID="cf7caa0d8be0be701fbcee782f2bc690c029f671f890bd4b17f2378101b1c315" exitCode=0
Dec 02 16:31:00 crc kubenswrapper[4933]: I1202 16:31:00.488026 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj" event={"ID":"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc","Type":"ContainerDied","Data":"cf7caa0d8be0be701fbcee782f2bc690c029f671f890bd4b17f2378101b1c315"}
Dec 02 16:31:01 crc kubenswrapper[4933]: I1202 16:31:01.971618 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074252 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-inventory\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074375 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ssh-key\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074424 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-power-monitoring-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074469 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074495 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074528 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-libvirt-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074559 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c7mq\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-kube-api-access-4c7mq\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074599 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-bootstrap-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074650 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-repo-setup-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074675 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074723 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-nova-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074777 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074807 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074888 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ovn-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074922 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.074951 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-neutron-metadata-combined-ca-bundle\") pod \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\" (UID: \"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc\") "
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.080130 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-kube-api-access-4c7mq" (OuterVolumeSpecName: "kube-api-access-4c7mq") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "kube-api-access-4c7mq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.082840 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.083065 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.083154 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.083257 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.083325 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.083362 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.086178 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.086756 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.086792 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.087354 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.088036 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.088171 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.088492 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.121908 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.124683 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-inventory" (OuterVolumeSpecName: "inventory") pod "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" (UID: "6efaf1ff-2f24-4e58-8899-a5ca660bd6cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178775 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178846 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178866 4933 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178887 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178910 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178930 4933 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178949 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c7mq\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-kube-api-access-4c7mq\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178966 4933 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.178983 4933 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179001 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179019 4933 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179037 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179059 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179079 4933 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179096 4933 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.179113 4933 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efaf1ff-2f24-4e58-8899-a5ca660bd6cc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.511947 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj" event={"ID":"6efaf1ff-2f24-4e58-8899-a5ca660bd6cc","Type":"ContainerDied","Data":"14ca1c2406b04e878ce8e76c8119d9a4a5cfaf3a2acdd4fb812314b25a30ef86"}
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.511986 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ca1c2406b04e878ce8e76c8119d9a4a5cfaf3a2acdd4fb812314b25a30ef86"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.512043 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-87krj"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.620944 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"]
Dec 02 16:31:02 crc kubenswrapper[4933]: E1202 16:31:02.621473 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.621492 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.621708 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6efaf1ff-2f24-4e58-8899-a5ca660bd6cc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.623140 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.625470 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.625529 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.625886 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.626063 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.626327 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.635736 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"]
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.689244 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.689423 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7rxb\" (UniqueName: \"kubernetes.io/projected/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-kube-api-access-s7rxb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.689525 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.689582 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.689737 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.797023 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.803606 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.804255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7rxb\" (UniqueName: \"kubernetes.io/projected/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-kube-api-access-s7rxb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.804316 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.804368 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.807616 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.809398 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.829729 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"
Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.831114 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\")
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.833860 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7rxb\" (UniqueName: \"kubernetes.io/projected/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-kube-api-access-s7rxb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-56tww\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" Dec 02 16:31:02 crc kubenswrapper[4933]: I1202 16:31:02.953054 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" Dec 02 16:31:03 crc kubenswrapper[4933]: I1202 16:31:03.552976 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww"] Dec 02 16:31:04 crc kubenswrapper[4933]: I1202 16:31:04.539706 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" event={"ID":"5bb48f13-b80b-462d-acf5-8751ec7aaa8e","Type":"ContainerStarted","Data":"94ea46e74b90ddb010818a02d386fbe9db9c3130e0e25b205eeb6ce87b8c7402"} Dec 02 16:31:04 crc kubenswrapper[4933]: I1202 16:31:04.540060 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" event={"ID":"5bb48f13-b80b-462d-acf5-8751ec7aaa8e","Type":"ContainerStarted","Data":"92a3d1803abf7056b85defcf64ced3d69b576c0cd4a03a1f8c6546edb83528d0"} Dec 02 16:31:04 crc kubenswrapper[4933]: I1202 16:31:04.557977 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" podStartSLOduration=1.899889006 podStartE2EDuration="2.557949607s" podCreationTimestamp="2025-12-02 16:31:02 +0000 UTC" firstStartedPulling="2025-12-02 16:31:03.552415277 +0000 UTC m=+2326.803641980" lastFinishedPulling="2025-12-02 16:31:04.210475878 +0000 UTC m=+2327.461702581" observedRunningTime="2025-12-02 16:31:04.552119571 +0000 UTC m=+2327.803346264" watchObservedRunningTime="2025-12-02 16:31:04.557949607 +0000 UTC m=+2327.809176310" Dec 02 16:31:10 crc kubenswrapper[4933]: I1202 16:31:10.054190 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-hjbbg"] Dec 02 16:31:10 crc kubenswrapper[4933]: I1202 16:31:10.072967 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-hjbbg"] Dec 02 16:31:11 crc kubenswrapper[4933]: I1202 16:31:11.068706 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747183c3-25ee-4d86-a4d6-57a21a12adda" path="/var/lib/kubelet/pods/747183c3-25ee-4d86-a4d6-57a21a12adda/volumes" Dec 02 16:31:12 crc kubenswrapper[4933]: I1202 16:31:12.053530 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:31:12 crc kubenswrapper[4933]: E1202 16:31:12.054516 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:31:27 crc kubenswrapper[4933]: I1202 16:31:27.063663 4933 scope.go:117] "RemoveContainer" 
containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:31:27 crc kubenswrapper[4933]: E1202 16:31:27.064662 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:31:40 crc kubenswrapper[4933]: I1202 16:31:40.053726 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:31:40 crc kubenswrapper[4933]: E1202 16:31:40.054847 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:31:46 crc kubenswrapper[4933]: I1202 16:31:46.248650 4933 scope.go:117] "RemoveContainer" containerID="d150a2ea84d17323ba5f1d3abe9bf753ac35e0c88d6ac8bbb05b056fad34319f" Dec 02 16:31:54 crc kubenswrapper[4933]: I1202 16:31:54.053982 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:31:54 crc kubenswrapper[4933]: E1202 16:31:54.055047 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:32:07 crc kubenswrapper[4933]: I1202 16:32:07.061146 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:32:07 crc kubenswrapper[4933]: E1202 16:32:07.061863 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:32:08 crc kubenswrapper[4933]: I1202 16:32:08.276539 4933 generic.go:334] "Generic (PLEG): container finished" podID="5bb48f13-b80b-462d-acf5-8751ec7aaa8e" containerID="94ea46e74b90ddb010818a02d386fbe9db9c3130e0e25b205eeb6ce87b8c7402" exitCode=0 Dec 02 16:32:08 crc kubenswrapper[4933]: I1202 16:32:08.276672 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" event={"ID":"5bb48f13-b80b-462d-acf5-8751ec7aaa8e","Type":"ContainerDied","Data":"94ea46e74b90ddb010818a02d386fbe9db9c3130e0e25b205eeb6ce87b8c7402"} Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.778752 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.831397 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovn-combined-ca-bundle\") pod \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.831495 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-inventory\") pod \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.831586 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ssh-key\") pod \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.831616 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7rxb\" (UniqueName: \"kubernetes.io/projected/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-kube-api-access-s7rxb\") pod \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.831789 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovncontroller-config-0\") pod \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\" (UID: \"5bb48f13-b80b-462d-acf5-8751ec7aaa8e\") " Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.837643 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-kube-api-access-s7rxb" (OuterVolumeSpecName: "kube-api-access-s7rxb") pod "5bb48f13-b80b-462d-acf5-8751ec7aaa8e" (UID: "5bb48f13-b80b-462d-acf5-8751ec7aaa8e"). InnerVolumeSpecName "kube-api-access-s7rxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.837747 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5bb48f13-b80b-462d-acf5-8751ec7aaa8e" (UID: "5bb48f13-b80b-462d-acf5-8751ec7aaa8e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.861069 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5bb48f13-b80b-462d-acf5-8751ec7aaa8e" (UID: "5bb48f13-b80b-462d-acf5-8751ec7aaa8e"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.863314 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5bb48f13-b80b-462d-acf5-8751ec7aaa8e" (UID: "5bb48f13-b80b-462d-acf5-8751ec7aaa8e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.866860 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-inventory" (OuterVolumeSpecName: "inventory") pod "5bb48f13-b80b-462d-acf5-8751ec7aaa8e" (UID: "5bb48f13-b80b-462d-acf5-8751ec7aaa8e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.936525 4933 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.936647 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.936659 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.936669 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7rxb\" (UniqueName: \"kubernetes.io/projected/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-kube-api-access-s7rxb\") on node \"crc\" DevicePath \"\"" Dec 02 16:32:09 crc kubenswrapper[4933]: I1202 16:32:09.936679 4933 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5bb48f13-b80b-462d-acf5-8751ec7aaa8e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.302791 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" event={"ID":"5bb48f13-b80b-462d-acf5-8751ec7aaa8e","Type":"ContainerDied","Data":"92a3d1803abf7056b85defcf64ced3d69b576c0cd4a03a1f8c6546edb83528d0"} Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.303145 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92a3d1803abf7056b85defcf64ced3d69b576c0cd4a03a1f8c6546edb83528d0" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.302871 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-56tww" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.420781 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2"] Dec 02 16:32:10 crc kubenswrapper[4933]: E1202 16:32:10.421450 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb48f13-b80b-462d-acf5-8751ec7aaa8e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.421476 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb48f13-b80b-462d-acf5-8751ec7aaa8e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.421779 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb48f13-b80b-462d-acf5-8751ec7aaa8e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.423431 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.433858 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.433920 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.434021 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.434071 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.434236 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.434277 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.439964 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2"] Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.560812 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.560900 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.561074 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.561280 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf926\" (UniqueName: \"kubernetes.io/projected/438c203a-5750-4377-95f9-3aa2353f6f53-kube-api-access-zf926\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.561345 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.561444 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.663702 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.663878 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.663938 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.663998 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: 
\"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.664085 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf926\" (UniqueName: \"kubernetes.io/projected/438c203a-5750-4377-95f9-3aa2353f6f53-kube-api-access-zf926\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.664129 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.669275 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.669437 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.669702 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.674106 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.678362 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.695909 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf926\" (UniqueName: 
\"kubernetes.io/projected/438c203a-5750-4377-95f9-3aa2353f6f53-kube-api-access-zf926\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:10 crc kubenswrapper[4933]: I1202 16:32:10.759306 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:32:11 crc kubenswrapper[4933]: I1202 16:32:11.282759 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2"] Dec 02 16:32:11 crc kubenswrapper[4933]: W1202 16:32:11.283901 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438c203a_5750_4377_95f9_3aa2353f6f53.slice/crio-eb3b7e2407d74e584b1200ef2e916c6ce09bd6075676e52cc33fb1d47103166e WatchSource:0}: Error finding container eb3b7e2407d74e584b1200ef2e916c6ce09bd6075676e52cc33fb1d47103166e: Status 404 returned error can't find the container with id eb3b7e2407d74e584b1200ef2e916c6ce09bd6075676e52cc33fb1d47103166e Dec 02 16:32:11 crc kubenswrapper[4933]: I1202 16:32:11.313854 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" event={"ID":"438c203a-5750-4377-95f9-3aa2353f6f53","Type":"ContainerStarted","Data":"eb3b7e2407d74e584b1200ef2e916c6ce09bd6075676e52cc33fb1d47103166e"} Dec 02 16:32:12 crc kubenswrapper[4933]: I1202 16:32:12.327650 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" event={"ID":"438c203a-5750-4377-95f9-3aa2353f6f53","Type":"ContainerStarted","Data":"ae14783c49793830b7d9224062ae50984639cdd0f37fdc8818d93aea01601c7d"} Dec 02 16:32:12 crc kubenswrapper[4933]: I1202 16:32:12.349365 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" podStartSLOduration=1.851681202 podStartE2EDuration="2.34934124s" podCreationTimestamp="2025-12-02 16:32:10 +0000 UTC" firstStartedPulling="2025-12-02 16:32:11.286135773 +0000 UTC m=+2394.537362476" lastFinishedPulling="2025-12-02 16:32:11.783795811 +0000 UTC m=+2395.035022514" observedRunningTime="2025-12-02 16:32:12.343312078 +0000 UTC m=+2395.594538781" watchObservedRunningTime="2025-12-02 16:32:12.34934124 +0000 UTC m=+2395.600567943" Dec 02 16:32:20 crc kubenswrapper[4933]: I1202 16:32:20.054400 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:32:20 crc kubenswrapper[4933]: E1202 16:32:20.055719 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:32:32 crc kubenswrapper[4933]: I1202 16:32:32.053671 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:32:32 crc kubenswrapper[4933]: E1202 16:32:32.054483 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:32:43 crc kubenswrapper[4933]: I1202 16:32:43.053759 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:32:43 crc kubenswrapper[4933]: E1202 16:32:43.054769 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:32:58 crc kubenswrapper[4933]: I1202 16:32:58.053344 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:32:58 crc kubenswrapper[4933]: E1202 16:32:58.054189 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:33:00 crc kubenswrapper[4933]: I1202 16:33:00.872259 4933 generic.go:334] "Generic (PLEG): container finished" podID="438c203a-5750-4377-95f9-3aa2353f6f53" containerID="ae14783c49793830b7d9224062ae50984639cdd0f37fdc8818d93aea01601c7d" exitCode=0 Dec 02 16:33:00 crc kubenswrapper[4933]: I1202 16:33:00.872455 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" event={"ID":"438c203a-5750-4377-95f9-3aa2353f6f53","Type":"ContainerDied","Data":"ae14783c49793830b7d9224062ae50984639cdd0f37fdc8818d93aea01601c7d"} Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.332910 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.455440 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf926\" (UniqueName: \"kubernetes.io/projected/438c203a-5750-4377-95f9-3aa2353f6f53-kube-api-access-zf926\") pod \"438c203a-5750-4377-95f9-3aa2353f6f53\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.455557 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-ssh-key\") pod \"438c203a-5750-4377-95f9-3aa2353f6f53\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.455677 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-metadata-combined-ca-bundle\") pod \"438c203a-5750-4377-95f9-3aa2353f6f53\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.455706 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"438c203a-5750-4377-95f9-3aa2353f6f53\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.455914 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-nova-metadata-neutron-config-0\") pod \"438c203a-5750-4377-95f9-3aa2353f6f53\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.455994 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-inventory\") pod \"438c203a-5750-4377-95f9-3aa2353f6f53\" (UID: \"438c203a-5750-4377-95f9-3aa2353f6f53\") " Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.462444 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "438c203a-5750-4377-95f9-3aa2353f6f53" (UID: "438c203a-5750-4377-95f9-3aa2353f6f53"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.463188 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438c203a-5750-4377-95f9-3aa2353f6f53-kube-api-access-zf926" (OuterVolumeSpecName: "kube-api-access-zf926") pod "438c203a-5750-4377-95f9-3aa2353f6f53" (UID: "438c203a-5750-4377-95f9-3aa2353f6f53"). InnerVolumeSpecName "kube-api-access-zf926". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.491547 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "438c203a-5750-4377-95f9-3aa2353f6f53" (UID: "438c203a-5750-4377-95f9-3aa2353f6f53"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.491800 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "438c203a-5750-4377-95f9-3aa2353f6f53" (UID: "438c203a-5750-4377-95f9-3aa2353f6f53"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.493953 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-inventory" (OuterVolumeSpecName: "inventory") pod "438c203a-5750-4377-95f9-3aa2353f6f53" (UID: "438c203a-5750-4377-95f9-3aa2353f6f53"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.508834 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "438c203a-5750-4377-95f9-3aa2353f6f53" (UID: "438c203a-5750-4377-95f9-3aa2353f6f53"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.559110 4933 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.559332 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.559389 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf926\" (UniqueName: \"kubernetes.io/projected/438c203a-5750-4377-95f9-3aa2353f6f53-kube-api-access-zf926\") on node \"crc\" DevicePath \"\"" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.559441 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.559497 4933 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.559553 4933 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/438c203a-5750-4377-95f9-3aa2353f6f53-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.897558 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" event={"ID":"438c203a-5750-4377-95f9-3aa2353f6f53","Type":"ContainerDied","Data":"eb3b7e2407d74e584b1200ef2e916c6ce09bd6075676e52cc33fb1d47103166e"} Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.897605 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb3b7e2407d74e584b1200ef2e916c6ce09bd6075676e52cc33fb1d47103166e" Dec 02 16:33:02 crc kubenswrapper[4933]: I1202 16:33:02.897689 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.009589 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r"] Dec 02 16:33:03 crc kubenswrapper[4933]: E1202 16:33:03.010265 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c203a-5750-4377-95f9-3aa2353f6f53" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.010288 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c203a-5750-4377-95f9-3aa2353f6f53" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.010615 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c203a-5750-4377-95f9-3aa2353f6f53" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.011783 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.015123 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.015361 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.015599 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.015903 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.016295 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.024129 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r"] Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.076223 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.077416 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.077487 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.077585 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.077635 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st4hv\" (UniqueName: \"kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.180303 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.180359 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.180416 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.180450 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st4hv\" (UniqueName: \"kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.180517 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.187539 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.187576 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.196493 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.197491 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.199118 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st4hv\" (UniqueName: \"kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.345694 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:33:03 crc kubenswrapper[4933]: I1202 16:33:03.957374 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r"] Dec 02 16:33:04 crc kubenswrapper[4933]: I1202 16:33:04.926680 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" event={"ID":"7b465ce0-4efc-4624-b140-f3bbb0e0b420","Type":"ContainerStarted","Data":"82dd50c934da092aa2398c925b7401b3243407b6c13ad596d5e45a523bd74cdb"} Dec 02 16:33:04 crc kubenswrapper[4933]: I1202 16:33:04.927558 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" event={"ID":"7b465ce0-4efc-4624-b140-f3bbb0e0b420","Type":"ContainerStarted","Data":"7bac38fdfdb063dc1fd90edfcfd64974971f1993329365714beb037665b0d545"} Dec 02 16:33:04 crc kubenswrapper[4933]: I1202 16:33:04.948815 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" podStartSLOduration=2.52247992 podStartE2EDuration="2.948794376s" podCreationTimestamp="2025-12-02 16:33:02 +0000 UTC" firstStartedPulling="2025-12-02 16:33:03.97095974 +0000 UTC m=+2447.222186463" lastFinishedPulling="2025-12-02 16:33:04.397274196 +0000 UTC m=+2447.648500919" observedRunningTime="2025-12-02 16:33:04.945084446 +0000 UTC m=+2448.196311149" watchObservedRunningTime="2025-12-02 16:33:04.948794376 +0000 UTC m=+2448.200021079" Dec 02 16:33:09 crc kubenswrapper[4933]: I1202 16:33:09.053341 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:33:09 crc kubenswrapper[4933]: E1202 16:33:09.053850 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:33:21 crc kubenswrapper[4933]: I1202 16:33:21.053391 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:33:21 crc kubenswrapper[4933]: E1202 16:33:21.055297 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:33:33 crc kubenswrapper[4933]: I1202 16:33:33.054819 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:33:33 crc kubenswrapper[4933]: E1202 16:33:33.056064 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:33:44 crc kubenswrapper[4933]: I1202 16:33:44.053918 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:33:44 crc kubenswrapper[4933]: E1202 16:33:44.054630 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:33:55 crc kubenswrapper[4933]: I1202 16:33:55.054154 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:33:55 crc kubenswrapper[4933]: E1202 16:33:55.055070 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:34:07 crc kubenswrapper[4933]: I1202 16:34:07.062151 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:34:07 crc kubenswrapper[4933]: E1202 16:34:07.063613 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:34:20 crc kubenswrapper[4933]: I1202 16:34:20.054919 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:34:20 crc kubenswrapper[4933]: E1202 16:34:20.056146 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.561913 4933 
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.561913 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nm778"]
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.564955 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.586589 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nm778"]
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.726700 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2kn\" (UniqueName: \"kubernetes.io/projected/6d107571-e567-4dff-bbce-918b69cc7fa2-kube-api-access-lm2kn\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.727006 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-catalog-content\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.727221 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-utilities\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.829708 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-utilities\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.830198 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2kn\" (UniqueName: \"kubernetes.io/projected/6d107571-e567-4dff-bbce-918b69cc7fa2-kube-api-access-lm2kn\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.830377 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-catalog-content\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.830427 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-utilities\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.830740 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-catalog-content\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.856177 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2kn\" (UniqueName: \"kubernetes.io/projected/6d107571-e567-4dff-bbce-918b69cc7fa2-kube-api-access-lm2kn\") pod \"community-operators-nm778\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:27 crc kubenswrapper[4933]: I1202 16:34:27.923513 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nm778"
Dec 02 16:34:28 crc kubenswrapper[4933]: I1202 16:34:28.513590 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nm778"]
Dec 02 16:34:28 crc kubenswrapper[4933]: I1202 16:34:28.911848 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerID="e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44" exitCode=0
Dec 02 16:34:28 crc kubenswrapper[4933]: I1202 16:34:28.911907 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nm778" event={"ID":"6d107571-e567-4dff-bbce-918b69cc7fa2","Type":"ContainerDied","Data":"e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44"}
Dec 02 16:34:28 crc kubenswrapper[4933]: I1202 16:34:28.911945 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nm778" event={"ID":"6d107571-e567-4dff-bbce-918b69cc7fa2","Type":"ContainerStarted","Data":"51edf94adf278ee1dfd6c49ff96b502ab1a5bde4f0f66bbd4be13d0eee1a54d4"}
Dec 02 16:34:30 crc kubenswrapper[4933]: I1202 16:34:30.931749 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerID="aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac" exitCode=0
Dec 02 16:34:30 crc kubenswrapper[4933]: I1202 16:34:30.931808 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nm778" event={"ID":"6d107571-e567-4dff-bbce-918b69cc7fa2","Type":"ContainerDied","Data":"aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac"}
Dec 02 16:34:31 crc kubenswrapper[4933]: I1202 16:34:31.053652 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"
Dec 02 16:34:31 crc kubenswrapper[4933]: E1202 16:34:31.054026 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:34:31 crc kubenswrapper[4933]: I1202 16:34:31.945111 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nm778" event={"ID":"6d107571-e567-4dff-bbce-918b69cc7fa2","Type":"ContainerStarted","Data":"cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0"}
pod="openshift-marketplace/community-operators-nm778" podStartSLOduration=2.247509839 podStartE2EDuration="4.969782382s" podCreationTimestamp="2025-12-02 16:34:27 +0000 UTC" firstStartedPulling="2025-12-02 16:34:28.91429938 +0000 UTC m=+2532.165526093" lastFinishedPulling="2025-12-02 16:34:31.636571933 +0000 UTC m=+2534.887798636" observedRunningTime="2025-12-02 16:34:31.96296146 +0000 UTC m=+2535.214188163" watchObservedRunningTime="2025-12-02 16:34:31.969782382 +0000 UTC m=+2535.221009085" Dec 02 16:34:37 crc kubenswrapper[4933]: I1202 16:34:37.924629 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nm778" Dec 02 16:34:37 crc kubenswrapper[4933]: I1202 16:34:37.925520 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nm778" Dec 02 16:34:37 crc kubenswrapper[4933]: I1202 16:34:37.985400 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nm778" Dec 02 16:34:38 crc kubenswrapper[4933]: I1202 16:34:38.066354 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nm778" Dec 02 16:34:38 crc kubenswrapper[4933]: I1202 16:34:38.547897 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nm778"] Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.044408 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nm778" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="registry-server" containerID="cri-o://cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0" gracePeriod=2 Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.531388 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nm778" Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.676151 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-catalog-content\") pod \"6d107571-e567-4dff-bbce-918b69cc7fa2\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.677795 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-utilities\") pod \"6d107571-e567-4dff-bbce-918b69cc7fa2\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.677993 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm2kn\" (UniqueName: \"kubernetes.io/projected/6d107571-e567-4dff-bbce-918b69cc7fa2-kube-api-access-lm2kn\") pod \"6d107571-e567-4dff-bbce-918b69cc7fa2\" (UID: \"6d107571-e567-4dff-bbce-918b69cc7fa2\") " Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.679094 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-utilities" (OuterVolumeSpecName: "utilities") pod "6d107571-e567-4dff-bbce-918b69cc7fa2" (UID: "6d107571-e567-4dff-bbce-918b69cc7fa2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.683215 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d107571-e567-4dff-bbce-918b69cc7fa2-kube-api-access-lm2kn" (OuterVolumeSpecName: "kube-api-access-lm2kn") pod "6d107571-e567-4dff-bbce-918b69cc7fa2" (UID: "6d107571-e567-4dff-bbce-918b69cc7fa2"). InnerVolumeSpecName "kube-api-access-lm2kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.744017 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d107571-e567-4dff-bbce-918b69cc7fa2" (UID: "6d107571-e567-4dff-bbce-918b69cc7fa2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.780465 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.780502 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm2kn\" (UniqueName: \"kubernetes.io/projected/6d107571-e567-4dff-bbce-918b69cc7fa2-kube-api-access-lm2kn\") on node \"crc\" DevicePath \"\"" Dec 02 16:34:40 crc kubenswrapper[4933]: I1202 16:34:40.780513 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d107571-e567-4dff-bbce-918b69cc7fa2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.057243 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerID="cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0" exitCode=0 Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.057377 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nm778" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.068911 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nm778" event={"ID":"6d107571-e567-4dff-bbce-918b69cc7fa2","Type":"ContainerDied","Data":"cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0"} Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.068952 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nm778" event={"ID":"6d107571-e567-4dff-bbce-918b69cc7fa2","Type":"ContainerDied","Data":"51edf94adf278ee1dfd6c49ff96b502ab1a5bde4f0f66bbd4be13d0eee1a54d4"} Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.068982 4933 scope.go:117] "RemoveContainer" containerID="cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.100536 4933 scope.go:117] "RemoveContainer" containerID="aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.102105 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nm778"] Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.113672 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nm778"] Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.121571 4933 scope.go:117] "RemoveContainer" containerID="e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.186048 4933 scope.go:117] "RemoveContainer" containerID="cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0" Dec 02 16:34:41 crc kubenswrapper[4933]: E1202 16:34:41.186465 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0\": container with ID starting with cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0 not found: ID does not exist" containerID="cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.186509 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0"} err="failed to get container status \"cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0\": rpc error: code = NotFound desc = could not find container \"cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0\": container with ID starting with cda2a649c8006acc57cedb585f20b36edd9902f6ff0b45f01b14d500df3b57b0 not found: ID does not exist" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.186538 4933 scope.go:117] "RemoveContainer" containerID="aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac" Dec 02 16:34:41 crc kubenswrapper[4933]: E1202 16:34:41.187174 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac\": container with ID starting with aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac not found: ID does not exist" containerID="aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.187199 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac"} err="failed to get container status \"aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac\": rpc error: code = NotFound desc = could not find container \"aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac\": container with ID starting with aeb343dabc0e0ce0e7905ee8ffeeec6dae3751690c2e5d46effb59a7daf287ac not found: ID does not exist" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.187214 4933 scope.go:117] "RemoveContainer" containerID="e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44" Dec 02 16:34:41 crc kubenswrapper[4933]: E1202 16:34:41.187462 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44\": container with ID starting with e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44 not found: ID does not exist" containerID="e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44" Dec 02 16:34:41 crc kubenswrapper[4933]: I1202 16:34:41.187488 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44"} err="failed to get container status \"e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44\": rpc error: code = NotFound desc = could not find container \"e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44\": container with ID starting with e4994fab5a8dc4f04f90a2c40eb740ce9d73c69ea002b775b433b43c04392a44 not found: ID does not exist" Dec 02 16:34:43 crc kubenswrapper[4933]: I1202 16:34:43.067744 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" path="/var/lib/kubelet/pods/6d107571-e567-4dff-bbce-918b69cc7fa2/volumes" Dec 02 16:34:46 crc kubenswrapper[4933]: I1202 16:34:46.053924 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:34:46 crc kubenswrapper[4933]: E1202 16:34:46.054677 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:35:01 crc kubenswrapper[4933]: I1202 16:35:01.054544 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:35:01 crc kubenswrapper[4933]: E1202 16:35:01.055301 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:35:12 crc kubenswrapper[4933]: I1202 16:35:12.053255 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:35:12 crc 
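The ContainerStatus/DeleteContainer NotFound errors just above are the benign tail of pod removal: the kubelet asks CRI-O about containers it has already deleted, gets rpc NotFound back, logs the error, and moves on. A client wanting the same idempotent behaviour only needs to classify the gRPC status code; a minimal sketch (not the kubelet's actual code path):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI call failed only because the
// container no longer exists, which a remover can treat as success.
func alreadyGone(err error) bool {
	s, ok := status.FromError(err)
	return ok && s.Code() == codes.NotFound
}

func main() {
	err := status.Error(codes.NotFound, "could not find container")
	fmt.Println("treat as already removed:", alreadyGone(err)) // true
}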
Dec 02 16:35:12 crc kubenswrapper[4933]: E1202 16:35:12.054246 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:35:25 crc kubenswrapper[4933]: I1202 16:35:25.053184 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b"
Dec 02 16:35:25 crc kubenswrapper[4933]: I1202 16:35:25.584336 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"c3819fcdd251b779a33969de8166c1977feab4eb854051d1c067f90a06dd33e0"}
Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.379924 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rtmgc"]
Dec 02 16:36:53 crc kubenswrapper[4933]: E1202 16:36:53.381115 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="extract-content"
Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.381131 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="extract-content"
Dec 02 16:36:53 crc kubenswrapper[4933]: E1202 16:36:53.381147 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="extract-utilities"
Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.381154 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="extract-utilities"
Dec 02 16:36:53 crc kubenswrapper[4933]: E1202 16:36:53.381178 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="registry-server"
Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.381184 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="registry-server"
Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.381515 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d107571-e567-4dff-bbce-918b69cc7fa2" containerName="registry-server"
Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.383794 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtmgc"
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.393759 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtmgc"] Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.436421 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-utilities\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.436582 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-catalog-content\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.436802 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8g7\" (UniqueName: \"kubernetes.io/projected/a6a26611-32c2-40f4-a952-d4fc763ffa37-kube-api-access-cb8g7\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.538554 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-catalog-content\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.538703 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8g7\" (UniqueName: \"kubernetes.io/projected/a6a26611-32c2-40f4-a952-d4fc763ffa37-kube-api-access-cb8g7\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.538790 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-utilities\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.539353 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-utilities\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.539942 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-catalog-content\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.562404 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cb8g7\" (UniqueName: \"kubernetes.io/projected/a6a26611-32c2-40f4-a952-d4fc763ffa37-kube-api-access-cb8g7\") pod \"redhat-operators-rtmgc\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") " pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:53 crc kubenswrapper[4933]: I1202 16:36:53.714663 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:36:54 crc kubenswrapper[4933]: I1202 16:36:54.248018 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtmgc"] Dec 02 16:36:54 crc kubenswrapper[4933]: I1202 16:36:54.993667 4933 generic.go:334] "Generic (PLEG): container finished" podID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerID="2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987" exitCode=0 Dec 02 16:36:54 crc kubenswrapper[4933]: I1202 16:36:54.993753 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerDied","Data":"2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987"} Dec 02 16:36:54 crc kubenswrapper[4933]: I1202 16:36:54.993966 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerStarted","Data":"64e7e249eceac6e38cc6b3601637402166423b6fd74ced49d0e36143f4c55b11"} Dec 02 16:36:54 crc kubenswrapper[4933]: I1202 16:36:54.997497 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:36:57 crc kubenswrapper[4933]: I1202 16:36:57.016423 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerStarted","Data":"d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c"} Dec 02 16:36:59 crc kubenswrapper[4933]: I1202 16:36:59.052071 4933 generic.go:334] "Generic (PLEG): container finished" podID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerID="d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c" exitCode=0 Dec 02 16:36:59 crc kubenswrapper[4933]: I1202 16:36:59.052113 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerDied","Data":"d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c"} Dec 02 16:37:00 crc kubenswrapper[4933]: I1202 16:37:00.065784 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerStarted","Data":"4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35"} Dec 02 16:37:00 crc kubenswrapper[4933]: I1202 16:37:00.089738 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rtmgc" podStartSLOduration=2.364885486 podStartE2EDuration="7.089720131s" podCreationTimestamp="2025-12-02 16:36:53 +0000 UTC" firstStartedPulling="2025-12-02 16:36:54.997226368 +0000 UTC m=+2678.248453071" lastFinishedPulling="2025-12-02 16:36:59.722061003 +0000 UTC m=+2682.973287716" observedRunningTime="2025-12-02 16:37:00.084110341 +0000 UTC m=+2683.335337054" watchObservedRunningTime="2025-12-02 16:37:00.089720131 +0000 UTC m=+2683.340946824" Dec 02 16:37:03 crc 
Dec 02 16:37:03 crc kubenswrapper[4933]: I1202 16:37:03.714858 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rtmgc"
Dec 02 16:37:03 crc kubenswrapper[4933]: I1202 16:37:03.715403 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rtmgc"
Dec 02 16:37:04 crc kubenswrapper[4933]: I1202 16:37:04.765808 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rtmgc" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="registry-server" probeResult="failure" output=<
Dec 02 16:37:04 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s
Dec 02 16:37:04 crc kubenswrapper[4933]: >
Dec 02 16:37:12 crc kubenswrapper[4933]: I1202 16:37:12.189717 4933 generic.go:334] "Generic (PLEG): container finished" podID="7b465ce0-4efc-4624-b140-f3bbb0e0b420" containerID="82dd50c934da092aa2398c925b7401b3243407b6c13ad596d5e45a523bd74cdb" exitCode=0
Dec 02 16:37:12 crc kubenswrapper[4933]: I1202 16:37:12.190025 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" event={"ID":"7b465ce0-4efc-4624-b140-f3bbb0e0b420","Type":"ContainerDied","Data":"82dd50c934da092aa2398c925b7401b3243407b6c13ad596d5e45a523bd74cdb"}
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.620875 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r"
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.700793 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-inventory\") pod \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") "
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.701470 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-combined-ca-bundle\") pod \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") "
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.701594 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st4hv\" (UniqueName: \"kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv\") pod \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") "
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.701753 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-ssh-key\") pod \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") "
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.701904 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-secret-0\") pod \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\" (UID: \"7b465ce0-4efc-4624-b140-f3bbb0e0b420\") "
Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.725043 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv" (OuterVolumeSpecName: "kube-api-access-st4hv") pod "7b465ce0-4efc-4624-b140-f3bbb0e0b420" (UID: "7b465ce0-4efc-4624-b140-f3bbb0e0b420"). InnerVolumeSpecName "kube-api-access-st4hv". PluginName "kubernetes.io/projected", VolumeGidValue ""
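The startup-probe output quoted above ("timeout: failed to connect service ':50051' within 1s") is the registry-server's gRPC health check timing out before the catalog has finished loading; the later "started"/"ready" transitions show it eventually passing. An equivalent hand-rolled check, with the address and 1s budget taken from the log and everything else an assumption of this sketch:

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

// Dial the health endpoint and run one standard gRPC health check,
// failing if the whole exchange does not finish within 1s.
func main() {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()
	conn, err := grpc.DialContext(ctx, "localhost:50051", grpc.WithInsecure(), grpc.WithBlock())
	if err != nil {
		fmt.Println("probe failure:", err) // analogous to the logged timeout
		return
	}
	defer conn.Close()
	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		fmt.Println("probe failure:", err)
		return
	}
	fmt.Println("probe status:", resp.Status)
}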
"kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv" (OuterVolumeSpecName: "kube-api-access-st4hv") pod "7b465ce0-4efc-4624-b140-f3bbb0e0b420" (UID: "7b465ce0-4efc-4624-b140-f3bbb0e0b420"). InnerVolumeSpecName "kube-api-access-st4hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.744462 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7b465ce0-4efc-4624-b140-f3bbb0e0b420" (UID: "7b465ce0-4efc-4624-b140-f3bbb0e0b420"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.756102 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7b465ce0-4efc-4624-b140-f3bbb0e0b420" (UID: "7b465ce0-4efc-4624-b140-f3bbb0e0b420"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.759961 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-inventory" (OuterVolumeSpecName: "inventory") pod "7b465ce0-4efc-4624-b140-f3bbb0e0b420" (UID: "7b465ce0-4efc-4624-b140-f3bbb0e0b420"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.797025 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b465ce0-4efc-4624-b140-f3bbb0e0b420" (UID: "7b465ce0-4efc-4624-b140-f3bbb0e0b420"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.805547 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.805585 4933 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.805597 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st4hv\" (UniqueName: \"kubernetes.io/projected/7b465ce0-4efc-4624-b140-f3bbb0e0b420-kube-api-access-st4hv\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.805606 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.805615 4933 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b465ce0-4efc-4624-b140-f3bbb0e0b420-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.809348 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:37:13 crc kubenswrapper[4933]: I1202 16:37:13.882619 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.069747 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtmgc"] Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.214906 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.216064 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r" event={"ID":"7b465ce0-4efc-4624-b140-f3bbb0e0b420","Type":"ContainerDied","Data":"7bac38fdfdb063dc1fd90edfcfd64974971f1993329365714beb037665b0d545"} Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.216134 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bac38fdfdb063dc1fd90edfcfd64974971f1993329365714beb037665b0d545" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.295733 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv"] Dec 02 16:37:14 crc kubenswrapper[4933]: E1202 16:37:14.296367 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b465ce0-4efc-4624-b140-f3bbb0e0b420" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.296393 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b465ce0-4efc-4624-b140-f3bbb0e0b420" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.296709 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b465ce0-4efc-4624-b140-f3bbb0e0b420" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.297684 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.301018 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.301461 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.301626 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.301733 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.301962 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.303544 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.305111 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.312880 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv"] Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.418997 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419152 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419189 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419227 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q6x2\" (UniqueName: \"kubernetes.io/projected/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-kube-api-access-6q6x2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419367 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419432 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419782 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.419940 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.420025 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521635 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521679 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521705 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q6x2\" (UniqueName: \"kubernetes.io/projected/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-kube-api-access-6q6x2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521737 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521760 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521874 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521921 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521956 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: 
\"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.521980 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.522748 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.527191 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.527227 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.527782 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.528380 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.528994 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.530649 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:37:14 crc 
Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.532041 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv"
Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.555880 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q6x2\" (UniqueName: \"kubernetes.io/projected/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-kube-api-access-6q6x2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbhhv\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv"
Dec 02 16:37:14 crc kubenswrapper[4933]: I1202 16:37:14.623671 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv"
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.203320 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv"]
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.226386 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" event={"ID":"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0","Type":"ContainerStarted","Data":"bed6b4203c03f8645c8ef5362b4ceb5c7b9d7617ea668d4eea2294b993505268"}
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.226556 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rtmgc" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="registry-server" containerID="cri-o://4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35" gracePeriod=2
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.733399 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtmgc"
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.863851 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-utilities\") pod \"a6a26611-32c2-40f4-a952-d4fc763ffa37\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") "
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.864200 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb8g7\" (UniqueName: \"kubernetes.io/projected/a6a26611-32c2-40f4-a952-d4fc763ffa37-kube-api-access-cb8g7\") pod \"a6a26611-32c2-40f4-a952-d4fc763ffa37\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") "
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.864364 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-catalog-content\") pod \"a6a26611-32c2-40f4-a952-d4fc763ffa37\" (UID: \"a6a26611-32c2-40f4-a952-d4fc763ffa37\") "
Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.865335 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-utilities" (OuterVolumeSpecName: "utilities") pod "a6a26611-32c2-40f4-a952-d4fc763ffa37" (UID: "a6a26611-32c2-40f4-a952-d4fc763ffa37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.869101 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a26611-32c2-40f4-a952-d4fc763ffa37-kube-api-access-cb8g7" (OuterVolumeSpecName: "kube-api-access-cb8g7") pod "a6a26611-32c2-40f4-a952-d4fc763ffa37" (UID: "a6a26611-32c2-40f4-a952-d4fc763ffa37"). InnerVolumeSpecName "kube-api-access-cb8g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.967325 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.967359 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb8g7\" (UniqueName: \"kubernetes.io/projected/a6a26611-32c2-40f4-a952-d4fc763ffa37-kube-api-access-cb8g7\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:15 crc kubenswrapper[4933]: I1202 16:37:15.975262 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6a26611-32c2-40f4-a952-d4fc763ffa37" (UID: "a6a26611-32c2-40f4-a952-d4fc763ffa37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.069030 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a26611-32c2-40f4-a952-d4fc763ffa37-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.249325 4933 generic.go:334] "Generic (PLEG): container finished" podID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerID="4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35" exitCode=0 Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.249478 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerDied","Data":"4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35"} Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.249514 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtmgc" event={"ID":"a6a26611-32c2-40f4-a952-d4fc763ffa37","Type":"ContainerDied","Data":"64e7e249eceac6e38cc6b3601637402166423b6fd74ced49d0e36143f4c55b11"} Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.249556 4933 scope.go:117] "RemoveContainer" containerID="4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.250768 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtmgc" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.253401 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" event={"ID":"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0","Type":"ContainerStarted","Data":"553b068134b0ec8453a42d285ff6e3e8416e1c72c197831fe3f967c811b55ed7"} Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.284654 4933 scope.go:117] "RemoveContainer" containerID="d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.286844 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" podStartSLOduration=1.7205648199999999 podStartE2EDuration="2.286807453s" podCreationTimestamp="2025-12-02 16:37:14 +0000 UTC" firstStartedPulling="2025-12-02 16:37:15.211808393 +0000 UTC m=+2698.463035096" lastFinishedPulling="2025-12-02 16:37:15.778051016 +0000 UTC m=+2699.029277729" observedRunningTime="2025-12-02 16:37:16.278728278 +0000 UTC m=+2699.529954981" watchObservedRunningTime="2025-12-02 16:37:16.286807453 +0000 UTC m=+2699.538034156" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.318910 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtmgc"] Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.321425 4933 scope.go:117] "RemoveContainer" containerID="2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.332269 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rtmgc"] Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.348668 4933 scope.go:117] "RemoveContainer" containerID="4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35" Dec 02 16:37:16 crc kubenswrapper[4933]: E1202 16:37:16.349043 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35\": container with ID starting with 4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35 not found: ID does not exist" containerID="4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.349081 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35"} err="failed to get container status \"4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35\": rpc error: code = NotFound desc = could not find container \"4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35\": container with ID starting with 4fcc2f33348861d0877cb99b919c59e662b893d9f9998bf18498fe831a421b35 not found: ID does not exist" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.349109 4933 scope.go:117] "RemoveContainer" containerID="d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c" Dec 02 16:37:16 crc kubenswrapper[4933]: E1202 16:37:16.349469 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c\": container with ID starting with d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c not found: ID does not 
exist" containerID="d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.349493 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c"} err="failed to get container status \"d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c\": rpc error: code = NotFound desc = could not find container \"d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c\": container with ID starting with d18ebc9b18bfa8d17d95a66a420345a1ccc2871e390720b47ef788eec0cb837c not found: ID does not exist" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.349506 4933 scope.go:117] "RemoveContainer" containerID="2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987" Dec 02 16:37:16 crc kubenswrapper[4933]: E1202 16:37:16.349803 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987\": container with ID starting with 2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987 not found: ID does not exist" containerID="2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987" Dec 02 16:37:16 crc kubenswrapper[4933]: I1202 16:37:16.349837 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987"} err="failed to get container status \"2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987\": rpc error: code = NotFound desc = could not find container \"2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987\": container with ID starting with 2a1817ea1967bd654feb1fab409c3513c116756efcfbba5871a272ebc4739987 not found: ID does not exist" Dec 02 16:37:17 crc kubenswrapper[4933]: I1202 16:37:17.072602 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" path="/var/lib/kubelet/pods/a6a26611-32c2-40f4-a952-d4fc763ffa37/volumes" Dec 02 16:37:47 crc kubenswrapper[4933]: I1202 16:37:47.169856 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:37:47 crc kubenswrapper[4933]: I1202 16:37:47.170294 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:38:17 crc kubenswrapper[4933]: I1202 16:38:17.170065 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:38:17 crc kubenswrapper[4933]: I1202 16:38:17.171435 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:38:47 crc kubenswrapper[4933]: I1202 16:38:47.169016 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:38:47 crc kubenswrapper[4933]: I1202 16:38:47.169542 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:38:47 crc kubenswrapper[4933]: I1202 16:38:47.169590 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:38:47 crc kubenswrapper[4933]: I1202 16:38:47.170860 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3819fcdd251b779a33969de8166c1977feab4eb854051d1c067f90a06dd33e0"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:38:47 crc kubenswrapper[4933]: I1202 16:38:47.170914 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://c3819fcdd251b779a33969de8166c1977feab4eb854051d1c067f90a06dd33e0" gracePeriod=600 Dec 02 16:38:48 crc kubenswrapper[4933]: I1202 16:38:48.074903 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="c3819fcdd251b779a33969de8166c1977feab4eb854051d1c067f90a06dd33e0" exitCode=0 Dec 02 16:38:48 crc kubenswrapper[4933]: I1202 16:38:48.074968 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"c3819fcdd251b779a33969de8166c1977feab4eb854051d1c067f90a06dd33e0"} Dec 02 16:38:48 crc kubenswrapper[4933]: I1202 16:38:48.075449 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"} Dec 02 16:38:48 crc kubenswrapper[4933]: I1202 16:38:48.075473 4933 scope.go:117] "RemoveContainer" containerID="b62e6bdb09561f36fac7e6658898fffc280a7586dbfd2297014e62834184505b" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.099669 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-smbj9"] Dec 02 16:39:17 crc kubenswrapper[4933]: E1202 16:39:17.100960 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="registry-server" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.100978 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" 
containerName="registry-server" Dec 02 16:39:17 crc kubenswrapper[4933]: E1202 16:39:17.101013 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="extract-content" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.101021 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="extract-content" Dec 02 16:39:17 crc kubenswrapper[4933]: E1202 16:39:17.101061 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="extract-utilities" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.101069 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="extract-utilities" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.101350 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a26611-32c2-40f4-a952-d4fc763ffa37" containerName="registry-server" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.103587 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.140191 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-smbj9"] Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.232315 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxw2\" (UniqueName: \"kubernetes.io/projected/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-kube-api-access-fjxw2\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.232414 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-catalog-content\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.232454 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-utilities\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.334748 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxw2\" (UniqueName: \"kubernetes.io/projected/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-kube-api-access-fjxw2\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.334877 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-catalog-content\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.334924 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-utilities\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.335511 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-utilities\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.335557 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-catalog-content\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.353767 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxw2\" (UniqueName: \"kubernetes.io/projected/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-kube-api-access-fjxw2\") pod \"redhat-marketplace-smbj9\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.437340 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:17 crc kubenswrapper[4933]: I1202 16:39:17.938235 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-smbj9"] Dec 02 16:39:18 crc kubenswrapper[4933]: I1202 16:39:18.463523 4933 generic.go:334] "Generic (PLEG): container finished" podID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerID="e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a" exitCode=0 Dec 02 16:39:18 crc kubenswrapper[4933]: I1202 16:39:18.464270 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerDied","Data":"e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a"} Dec 02 16:39:18 crc kubenswrapper[4933]: I1202 16:39:18.464372 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerStarted","Data":"8b84ab047de5d85447f7ad5e2554a0df7ff2882cce9f1a4c9148bb4cc017a17b"} Dec 02 16:39:19 crc kubenswrapper[4933]: I1202 16:39:19.475437 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerStarted","Data":"10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906"} Dec 02 16:39:20 crc kubenswrapper[4933]: I1202 16:39:20.489163 4933 generic.go:334] "Generic (PLEG): container finished" podID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerID="10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906" exitCode=0 Dec 02 16:39:20 crc kubenswrapper[4933]: I1202 16:39:20.489601 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" 
event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerDied","Data":"10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906"} Dec 02 16:39:22 crc kubenswrapper[4933]: I1202 16:39:22.514234 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerStarted","Data":"9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9"} Dec 02 16:39:22 crc kubenswrapper[4933]: I1202 16:39:22.541291 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-smbj9" podStartSLOduration=1.7849420230000002 podStartE2EDuration="5.541269301s" podCreationTimestamp="2025-12-02 16:39:17 +0000 UTC" firstStartedPulling="2025-12-02 16:39:18.466635647 +0000 UTC m=+2821.717862340" lastFinishedPulling="2025-12-02 16:39:22.222962915 +0000 UTC m=+2825.474189618" observedRunningTime="2025-12-02 16:39:22.531018487 +0000 UTC m=+2825.782245190" watchObservedRunningTime="2025-12-02 16:39:22.541269301 +0000 UTC m=+2825.792496004" Dec 02 16:39:27 crc kubenswrapper[4933]: I1202 16:39:27.438044 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:27 crc kubenswrapper[4933]: I1202 16:39:27.439843 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:27 crc kubenswrapper[4933]: I1202 16:39:27.502155 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:27 crc kubenswrapper[4933]: I1202 16:39:27.636564 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:27 crc kubenswrapper[4933]: I1202 16:39:27.744893 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-smbj9"] Dec 02 16:39:29 crc kubenswrapper[4933]: I1202 16:39:29.584155 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-smbj9" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="registry-server" containerID="cri-o://9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9" gracePeriod=2 Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.237676 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.256307 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-catalog-content\") pod \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.256432 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-utilities\") pod \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.256845 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjxw2\" (UniqueName: \"kubernetes.io/projected/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-kube-api-access-fjxw2\") pod \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\" (UID: \"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa\") " Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.260673 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-utilities" (OuterVolumeSpecName: "utilities") pod "4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" (UID: "4a6f6bc7-4417-480f-8b2e-70cf6485e9aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.264420 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-kube-api-access-fjxw2" (OuterVolumeSpecName: "kube-api-access-fjxw2") pod "4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" (UID: "4a6f6bc7-4417-480f-8b2e-70cf6485e9aa"). InnerVolumeSpecName "kube-api-access-fjxw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.285063 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" (UID: "4a6f6bc7-4417-480f-8b2e-70cf6485e9aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.359239 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjxw2\" (UniqueName: \"kubernetes.io/projected/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-kube-api-access-fjxw2\") on node \"crc\" DevicePath \"\"" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.359270 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.359280 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.600206 4933 generic.go:334] "Generic (PLEG): container finished" podID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerID="9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9" exitCode=0 Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.600273 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerDied","Data":"9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9"} Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.601748 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smbj9" event={"ID":"4a6f6bc7-4417-480f-8b2e-70cf6485e9aa","Type":"ContainerDied","Data":"8b84ab047de5d85447f7ad5e2554a0df7ff2882cce9f1a4c9148bb4cc017a17b"} Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.601854 4933 scope.go:117] "RemoveContainer" containerID="9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.600288 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smbj9" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.635082 4933 scope.go:117] "RemoveContainer" containerID="10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.687994 4933 scope.go:117] "RemoveContainer" containerID="e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.699969 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-smbj9"] Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.716925 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-smbj9"] Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.732234 4933 scope.go:117] "RemoveContainer" containerID="9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9" Dec 02 16:39:30 crc kubenswrapper[4933]: E1202 16:39:30.733323 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9\": container with ID starting with 9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9 not found: ID does not exist" containerID="9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.733359 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9"} err="failed to get container status \"9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9\": rpc error: code = NotFound desc = could not find container \"9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9\": container with ID starting with 9e1594eecd607a767b0c63100b1935a0ad52ec5cbf4310b6be6f543e398e79e9 not found: ID does not exist" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.733385 4933 scope.go:117] "RemoveContainer" containerID="10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906" Dec 02 16:39:30 crc kubenswrapper[4933]: E1202 16:39:30.733615 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906\": container with ID starting with 10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906 not found: ID does not exist" containerID="10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.733643 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906"} err="failed to get container status \"10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906\": rpc error: code = NotFound desc = could not find container \"10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906\": container with ID starting with 10c4e1ae2c87d071125cdd01ff8fa8eecc32aac4adb16f3cb7dc8ed6c14ad906 not found: ID does not exist" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.733660 4933 scope.go:117] "RemoveContainer" containerID="e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a" Dec 02 16:39:30 crc kubenswrapper[4933]: E1202 16:39:30.733872 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a\": container with ID starting with e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a not found: ID does not exist" containerID="e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a" Dec 02 16:39:30 crc kubenswrapper[4933]: I1202 16:39:30.733899 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a"} err="failed to get container status \"e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a\": rpc error: code = NotFound desc = could not find container \"e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a\": container with ID starting with e92b0364bc807e96c21de127b81d8b332c6438b4de1987710352c34ec7302f9a not found: ID does not exist" Dec 02 16:39:31 crc kubenswrapper[4933]: I1202 16:39:31.069425 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" path="/var/lib/kubelet/pods/4a6f6bc7-4417-480f-8b2e-70cf6485e9aa/volumes" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.141116 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74j9l"] Dec 02 16:39:37 crc kubenswrapper[4933]: E1202 16:39:37.142895 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="extract-utilities" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.142922 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="extract-utilities" Dec 02 16:39:37 crc kubenswrapper[4933]: E1202 16:39:37.142991 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="registry-server" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.143001 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="registry-server" Dec 02 16:39:37 crc kubenswrapper[4933]: E1202 16:39:37.143044 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="extract-content" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.143054 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="extract-content" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.144002 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6f6bc7-4417-480f-8b2e-70cf6485e9aa" containerName="registry-server" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.146861 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.157314 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74j9l"] Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.243355 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-catalog-content\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.245178 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g95vz\" (UniqueName: \"kubernetes.io/projected/f232e9bb-2848-43ff-a71f-d05c4efc23f2-kube-api-access-g95vz\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.245344 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-utilities\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.349529 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-catalog-content\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.349658 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g95vz\" (UniqueName: \"kubernetes.io/projected/f232e9bb-2848-43ff-a71f-d05c4efc23f2-kube-api-access-g95vz\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.349718 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-utilities\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.350231 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-catalog-content\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.350401 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-utilities\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.369782 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g95vz\" (UniqueName: \"kubernetes.io/projected/f232e9bb-2848-43ff-a71f-d05c4efc23f2-kube-api-access-g95vz\") pod \"certified-operators-74j9l\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.481604 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:37 crc kubenswrapper[4933]: I1202 16:39:37.977235 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74j9l"] Dec 02 16:39:38 crc kubenswrapper[4933]: I1202 16:39:38.701704 4933 generic.go:334] "Generic (PLEG): container finished" podID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerID="e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b" exitCode=0 Dec 02 16:39:38 crc kubenswrapper[4933]: I1202 16:39:38.701758 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerDied","Data":"e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b"} Dec 02 16:39:38 crc kubenswrapper[4933]: I1202 16:39:38.702045 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerStarted","Data":"8fd215056a2d37c52dd89e85741751a4a30ec133206f02e759d7b5af5a8b3991"} Dec 02 16:39:39 crc kubenswrapper[4933]: I1202 16:39:39.717269 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerStarted","Data":"958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6"} Dec 02 16:39:40 crc kubenswrapper[4933]: E1202 16:39:40.535139 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf232e9bb_2848_43ff_a71f_d05c4efc23f2.slice/crio-958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf232e9bb_2848_43ff_a71f_d05c4efc23f2.slice/crio-conmon-958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6.scope\": RecentStats: unable to find data in memory cache]" Dec 02 16:39:40 crc kubenswrapper[4933]: I1202 16:39:40.735905 4933 generic.go:334] "Generic (PLEG): container finished" podID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerID="958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6" exitCode=0 Dec 02 16:39:40 crc kubenswrapper[4933]: I1202 16:39:40.735995 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerDied","Data":"958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6"} Dec 02 16:39:41 crc kubenswrapper[4933]: I1202 16:39:41.748332 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerStarted","Data":"28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455"} Dec 02 16:39:41 crc kubenswrapper[4933]: I1202 16:39:41.774995 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74j9l" podStartSLOduration=2.217784381 podStartE2EDuration="4.774976425s" podCreationTimestamp="2025-12-02 16:39:37 +0000 UTC" firstStartedPulling="2025-12-02 16:39:38.703883598 +0000 UTC m=+2841.955110301" lastFinishedPulling="2025-12-02 16:39:41.261075642 +0000 UTC m=+2844.512302345" observedRunningTime="2025-12-02 16:39:41.762707407 +0000 UTC m=+2845.013934110" watchObservedRunningTime="2025-12-02 16:39:41.774976425 +0000 UTC m=+2845.026203128" Dec 02 16:39:47 crc kubenswrapper[4933]: I1202 16:39:47.482074 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:47 crc kubenswrapper[4933]: I1202 16:39:47.483578 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:47 crc kubenswrapper[4933]: I1202 16:39:47.542963 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:47 crc kubenswrapper[4933]: I1202 16:39:47.869736 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:48 crc kubenswrapper[4933]: I1202 16:39:48.892850 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74j9l"] Dec 02 16:39:50 crc kubenswrapper[4933]: I1202 16:39:50.834487 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74j9l" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="registry-server" containerID="cri-o://28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455" gracePeriod=2 Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.508790 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.584594 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g95vz\" (UniqueName: \"kubernetes.io/projected/f232e9bb-2848-43ff-a71f-d05c4efc23f2-kube-api-access-g95vz\") pod \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.584647 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-catalog-content\") pod \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.584812 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-utilities\") pod \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\" (UID: \"f232e9bb-2848-43ff-a71f-d05c4efc23f2\") " Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.585686 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-utilities" (OuterVolumeSpecName: "utilities") pod "f232e9bb-2848-43ff-a71f-d05c4efc23f2" (UID: "f232e9bb-2848-43ff-a71f-d05c4efc23f2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.590351 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f232e9bb-2848-43ff-a71f-d05c4efc23f2-kube-api-access-g95vz" (OuterVolumeSpecName: "kube-api-access-g95vz") pod "f232e9bb-2848-43ff-a71f-d05c4efc23f2" (UID: "f232e9bb-2848-43ff-a71f-d05c4efc23f2"). InnerVolumeSpecName "kube-api-access-g95vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.632419 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f232e9bb-2848-43ff-a71f-d05c4efc23f2" (UID: "f232e9bb-2848-43ff-a71f-d05c4efc23f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.687582 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.687616 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g95vz\" (UniqueName: \"kubernetes.io/projected/f232e9bb-2848-43ff-a71f-d05c4efc23f2-kube-api-access-g95vz\") on node \"crc\" DevicePath \"\"" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.687627 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f232e9bb-2848-43ff-a71f-d05c4efc23f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.848680 4933 generic.go:334] "Generic (PLEG): container finished" podID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerID="28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455" exitCode=0 Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.848745 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74j9l" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.848730 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerDied","Data":"28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455"} Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.848927 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74j9l" event={"ID":"f232e9bb-2848-43ff-a71f-d05c4efc23f2","Type":"ContainerDied","Data":"8fd215056a2d37c52dd89e85741751a4a30ec133206f02e759d7b5af5a8b3991"} Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.848949 4933 scope.go:117] "RemoveContainer" containerID="28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.881029 4933 scope.go:117] "RemoveContainer" containerID="958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.890183 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74j9l"] Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.900035 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74j9l"] Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.919960 4933 scope.go:117] "RemoveContainer" containerID="e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.979568 4933 scope.go:117] "RemoveContainer" containerID="28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455" Dec 02 16:39:51 crc kubenswrapper[4933]: E1202 16:39:51.980049 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455\": container with ID starting with 28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455 not found: ID does not exist" containerID="28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.980086 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455"} err="failed to get container status \"28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455\": rpc error: code = NotFound desc = could not find container \"28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455\": container with ID starting with 28a9afeb8141c15e055e570025ac92b433cdb107d5948115e8da0fb87f680455 not found: ID does not exist" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.980114 4933 scope.go:117] "RemoveContainer" containerID="958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6" Dec 02 16:39:51 crc kubenswrapper[4933]: E1202 16:39:51.980738 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6\": container with ID starting with 958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6 not found: ID does not exist" containerID="958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.980758 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6"} err="failed to get container status \"958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6\": rpc error: code = NotFound desc = could not find container \"958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6\": container with ID starting with 958d6bd19ea8ae5ba9b8ba660404502bc59f8b021841f697fea06b2ce147dea6 not found: ID does not exist" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.980775 4933 scope.go:117] "RemoveContainer" containerID="e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b" Dec 02 16:39:51 crc kubenswrapper[4933]: E1202 16:39:51.983589 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b\": container with ID starting with e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b not found: ID does not exist" containerID="e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b" Dec 02 16:39:51 crc kubenswrapper[4933]: I1202 16:39:51.983610 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b"} err="failed to get container status \"e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b\": rpc error: code = NotFound desc = could not find container \"e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b\": container with ID starting with e7140068f7243a056c558a4ecce4a80c43aea232463403a317b7f6deb20b193b not found: ID does not exist" Dec 02 16:39:53 crc kubenswrapper[4933]: I1202 16:39:53.071007 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" path="/var/lib/kubelet/pods/f232e9bb-2848-43ff-a71f-d05c4efc23f2/volumes" Dec 02 16:39:58 crc kubenswrapper[4933]: I1202 16:39:58.927908 4933 generic.go:334] "Generic (PLEG): container finished" podID="5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" containerID="553b068134b0ec8453a42d285ff6e3e8416e1c72c197831fe3f967c811b55ed7" exitCode=0 Dec 02 16:39:58 crc kubenswrapper[4933]: I1202 16:39:58.929156 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" event={"ID":"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0","Type":"ContainerDied","Data":"553b068134b0ec8453a42d285ff6e3e8416e1c72c197831fe3f967c811b55ed7"} Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.467893 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516040 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-combined-ca-bundle\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516091 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-inventory\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516124 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-0\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516186 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q6x2\" (UniqueName: \"kubernetes.io/projected/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-kube-api-access-6q6x2\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516257 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-1\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516294 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-ssh-key\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516312 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-0\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516352 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-1\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.516380 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-extra-config-0\") pod \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\" (UID: \"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0\") " Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.562862 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.565209 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-kube-api-access-6q6x2" (OuterVolumeSpecName: "kube-api-access-6q6x2") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "kube-api-access-6q6x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.596374 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.599559 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.608951 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.619051 4933 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.619085 4933 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.619097 4933 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.619110 4933 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.619123 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q6x2\" (UniqueName: \"kubernetes.io/projected/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-kube-api-access-6q6x2\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.624355 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.625753 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.626920 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.631549 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-inventory" (OuterVolumeSpecName: "inventory") pod "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" (UID: "5e83fdad-1cda-4282-b5e7-6911bdd8d9a0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.722363 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.722710 4933 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.722729 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.722745 4933 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e83fdad-1cda-4282-b5e7-6911bdd8d9a0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.956940 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" event={"ID":"5e83fdad-1cda-4282-b5e7-6911bdd8d9a0","Type":"ContainerDied","Data":"bed6b4203c03f8645c8ef5362b4ceb5c7b9d7617ea668d4eea2294b993505268"} Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.956996 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed6b4203c03f8645c8ef5362b4ceb5c7b9d7617ea668d4eea2294b993505268" Dec 02 16:40:00 crc kubenswrapper[4933]: I1202 16:40:00.957002 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbhhv" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.138223 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp"] Dec 02 16:40:01 crc kubenswrapper[4933]: E1202 16:40:01.138717 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.138735 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 16:40:01 crc kubenswrapper[4933]: E1202 16:40:01.138752 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="registry-server" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.138758 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="registry-server" Dec 02 16:40:01 crc kubenswrapper[4933]: E1202 16:40:01.138784 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="extract-utilities" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.138791 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="extract-utilities" Dec 02 16:40:01 crc kubenswrapper[4933]: E1202 16:40:01.138805 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="extract-content" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.138811 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="extract-content" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.139040 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e83fdad-1cda-4282-b5e7-6911bdd8d9a0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.139066 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="f232e9bb-2848-43ff-a71f-d05c4efc23f2" containerName="registry-server" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.139870 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.144640 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.144737 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.144791 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.149276 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp"] Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.149554 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.149952 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.341702 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.342308 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.342549 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.342919 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.343340 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 
16:40:01.343536 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccjz\" (UniqueName: \"kubernetes.io/projected/efd2ea83-1192-411e-9c7f-bff8761883fb-kube-api-access-bccjz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.343910 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446007 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446111 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446153 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccjz\" (UniqueName: \"kubernetes.io/projected/efd2ea83-1192-411e-9c7f-bff8761883fb-kube-api-access-bccjz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446206 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446255 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446292 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.446313 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.451543 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.451559 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.452902 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.452947 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.453405 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.454106 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.467058 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccjz\" (UniqueName: \"kubernetes.io/projected/efd2ea83-1192-411e-9c7f-bff8761883fb-kube-api-access-bccjz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") 
" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:01 crc kubenswrapper[4933]: I1202 16:40:01.497195 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:40:02 crc kubenswrapper[4933]: I1202 16:40:02.038564 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp"] Dec 02 16:40:02 crc kubenswrapper[4933]: I1202 16:40:02.985081 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" event={"ID":"efd2ea83-1192-411e-9c7f-bff8761883fb","Type":"ContainerStarted","Data":"74548f55acc06066a7a242b3e599b9845a952307a83be00ac4b8b82fc810e13e"} Dec 02 16:40:03 crc kubenswrapper[4933]: I1202 16:40:03.998914 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" event={"ID":"efd2ea83-1192-411e-9c7f-bff8761883fb","Type":"ContainerStarted","Data":"01507f838c23dc391cebf0b8b55fd3d49037866f39f08b295a71c30df67628d6"} Dec 02 16:40:04 crc kubenswrapper[4933]: I1202 16:40:04.045274 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" podStartSLOduration=2.347192011 podStartE2EDuration="3.045251794s" podCreationTimestamp="2025-12-02 16:40:01 +0000 UTC" firstStartedPulling="2025-12-02 16:40:02.042175747 +0000 UTC m=+2865.293402460" lastFinishedPulling="2025-12-02 16:40:02.74023555 +0000 UTC m=+2865.991462243" observedRunningTime="2025-12-02 16:40:04.036971913 +0000 UTC m=+2867.288198616" watchObservedRunningTime="2025-12-02 16:40:04.045251794 +0000 UTC m=+2867.296478497" Dec 02 16:40:47 crc kubenswrapper[4933]: I1202 16:40:47.169218 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:40:47 crc kubenswrapper[4933]: I1202 16:40:47.169720 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:41:17 crc kubenswrapper[4933]: I1202 16:41:17.169072 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:41:17 crc kubenswrapper[4933]: I1202 16:41:17.169687 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:41:47 crc kubenswrapper[4933]: I1202 16:41:47.169906 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 16:41:47 crc kubenswrapper[4933]: I1202 16:41:47.170459 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 16:41:47 crc kubenswrapper[4933]: I1202 16:41:47.170504 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 16:41:47 crc kubenswrapper[4933]: I1202 16:41:47.171436 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 16:41:47 crc kubenswrapper[4933]: I1202 16:41:47.171493 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" gracePeriod=600 Dec 02 16:41:47 crc kubenswrapper[4933]: E1202 16:41:47.295621 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:41:48 crc kubenswrapper[4933]: I1202 16:41:48.183844 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" exitCode=0 Dec 02 16:41:48 crc kubenswrapper[4933]: I1202 16:41:48.183865 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"} Dec 02 16:41:48 crc kubenswrapper[4933]: I1202 16:41:48.184141 4933 scope.go:117] "RemoveContainer" containerID="c3819fcdd251b779a33969de8166c1977feab4eb854051d1c067f90a06dd33e0" Dec 02 16:41:48 crc kubenswrapper[4933]: I1202 16:41:48.184905 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:41:48 crc kubenswrapper[4933]: E1202 16:41:48.185212 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:42:00 crc kubenswrapper[4933]: I1202 16:42:00.053738 4933 scope.go:117] 
"RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:42:00 crc kubenswrapper[4933]: E1202 16:42:00.054714 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:42:13 crc kubenswrapper[4933]: I1202 16:42:13.053207 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:42:13 crc kubenswrapper[4933]: E1202 16:42:13.054451 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:42:27 crc kubenswrapper[4933]: I1202 16:42:27.065123 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:42:27 crc kubenswrapper[4933]: E1202 16:42:27.066178 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:42:38 crc kubenswrapper[4933]: I1202 16:42:38.053537 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:42:38 crc kubenswrapper[4933]: E1202 16:42:38.054419 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:42:49 crc kubenswrapper[4933]: I1202 16:42:49.052953 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:42:49 crc kubenswrapper[4933]: E1202 16:42:49.053793 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:43:02 crc kubenswrapper[4933]: I1202 16:43:02.054111 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:43:02 crc kubenswrapper[4933]: E1202 16:43:02.055083 4933 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:43:14 crc kubenswrapper[4933]: I1202 16:43:14.053607 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:43:14 crc kubenswrapper[4933]: E1202 16:43:14.054526 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:43:26 crc kubenswrapper[4933]: I1202 16:43:26.071591 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:43:26 crc kubenswrapper[4933]: E1202 16:43:26.072342 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:43:39 crc kubenswrapper[4933]: I1202 16:43:39.053939 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:43:39 crc kubenswrapper[4933]: E1202 16:43:39.054697 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:43:39 crc kubenswrapper[4933]: I1202 16:43:39.469803 4933 generic.go:334] "Generic (PLEG): container finished" podID="efd2ea83-1192-411e-9c7f-bff8761883fb" containerID="01507f838c23dc391cebf0b8b55fd3d49037866f39f08b295a71c30df67628d6" exitCode=0 Dec 02 16:43:39 crc kubenswrapper[4933]: I1202 16:43:39.469865 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" event={"ID":"efd2ea83-1192-411e-9c7f-bff8761883fb","Type":"ContainerDied","Data":"01507f838c23dc391cebf0b8b55fd3d49037866f39f08b295a71c30df67628d6"} Dec 02 16:43:40 crc kubenswrapper[4933]: I1202 16:43:40.953775 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.074519 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-inventory\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.074562 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccjz\" (UniqueName: \"kubernetes.io/projected/efd2ea83-1192-411e-9c7f-bff8761883fb-kube-api-access-bccjz\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.074686 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-1\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.074977 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-telemetry-combined-ca-bundle\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.075040 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ssh-key\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.075104 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-2\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.075170 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-0\") pod \"efd2ea83-1192-411e-9c7f-bff8761883fb\" (UID: \"efd2ea83-1192-411e-9c7f-bff8761883fb\") " Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.081347 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd2ea83-1192-411e-9c7f-bff8761883fb-kube-api-access-bccjz" (OuterVolumeSpecName: "kube-api-access-bccjz") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). InnerVolumeSpecName "kube-api-access-bccjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.083134 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.108682 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.108718 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.110775 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-inventory" (OuterVolumeSpecName: "inventory") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.114989 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.117845 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "efd2ea83-1192-411e-9c7f-bff8761883fb" (UID: "efd2ea83-1192-411e-9c7f-bff8761883fb"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178590 4933 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178642 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178658 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178672 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178686 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178698 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccjz\" (UniqueName: \"kubernetes.io/projected/efd2ea83-1192-411e-9c7f-bff8761883fb-kube-api-access-bccjz\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.178747 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efd2ea83-1192-411e-9c7f-bff8761883fb-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.517783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" event={"ID":"efd2ea83-1192-411e-9c7f-bff8761883fb","Type":"ContainerDied","Data":"74548f55acc06066a7a242b3e599b9845a952307a83be00ac4b8b82fc810e13e"} Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.517840 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74548f55acc06066a7a242b3e599b9845a952307a83be00ac4b8b82fc810e13e" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.517868 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.605575 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx"] Dec 02 16:43:41 crc kubenswrapper[4933]: E1202 16:43:41.606152 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd2ea83-1192-411e-9c7f-bff8761883fb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.606174 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd2ea83-1192-411e-9c7f-bff8761883fb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.606394 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd2ea83-1192-411e-9c7f-bff8761883fb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.607303 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.610097 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.610406 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.610679 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.610942 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.611111 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.633301 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx"] Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.791746 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.792145 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.792176 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq9ts\" (UniqueName: \"kubernetes.io/projected/06b52c33-11b2-4a83-a9b8-2de5845e6e89-kube-api-access-pq9ts\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.792220 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.792462 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.792573 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.792637 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.896393 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.896500 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq9ts\" (UniqueName: \"kubernetes.io/projected/06b52c33-11b2-4a83-a9b8-2de5845e6e89-kube-api-access-pq9ts\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.896624 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.896782 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.896897 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.896959 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.897230 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.903297 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.903416 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.903977 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.905184 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.908486 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.917242 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.918383 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq9ts\" (UniqueName: \"kubernetes.io/projected/06b52c33-11b2-4a83-a9b8-2de5845e6e89-kube-api-access-pq9ts\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:41 crc kubenswrapper[4933]: I1202 16:43:41.928746 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:43:42 crc kubenswrapper[4933]: I1202 16:43:42.465272 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx"] Dec 02 16:43:42 crc kubenswrapper[4933]: W1202 16:43:42.468411 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b52c33_11b2_4a83_a9b8_2de5845e6e89.slice/crio-6b376b01a97f9ef4d25a73b007737b6a914c8acb9938e485cc68376806ab6c10 WatchSource:0}: Error finding container 6b376b01a97f9ef4d25a73b007737b6a914c8acb9938e485cc68376806ab6c10: Status 404 returned error can't find the container with id 6b376b01a97f9ef4d25a73b007737b6a914c8acb9938e485cc68376806ab6c10 Dec 02 16:43:42 crc kubenswrapper[4933]: I1202 16:43:42.470411 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:43:42 crc kubenswrapper[4933]: I1202 16:43:42.528205 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" event={"ID":"06b52c33-11b2-4a83-a9b8-2de5845e6e89","Type":"ContainerStarted","Data":"6b376b01a97f9ef4d25a73b007737b6a914c8acb9938e485cc68376806ab6c10"} Dec 02 16:43:43 crc kubenswrapper[4933]: I1202 16:43:43.544274 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" event={"ID":"06b52c33-11b2-4a83-a9b8-2de5845e6e89","Type":"ContainerStarted","Data":"6400317194d6c65ae61777c8bfcc362ab95fb4aa579dd0be2d8087553d023a3f"} Dec 02 16:43:43 crc kubenswrapper[4933]: I1202 16:43:43.581756 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" podStartSLOduration=1.8696650940000001 podStartE2EDuration="2.581734803s" podCreationTimestamp="2025-12-02 16:43:41 +0000 UTC" firstStartedPulling="2025-12-02 16:43:42.47020417 +0000 UTC m=+3085.721430873" lastFinishedPulling="2025-12-02 16:43:43.182273879 +0000 UTC m=+3086.433500582" observedRunningTime="2025-12-02 16:43:43.564204435 +0000 UTC m=+3086.815431138" watchObservedRunningTime="2025-12-02 16:43:43.581734803 +0000 UTC m=+3086.832961516" Dec 02 16:43:54 crc kubenswrapper[4933]: I1202 16:43:54.053033 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:43:54 crc kubenswrapper[4933]: E1202 16:43:54.053808 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:44:06 crc kubenswrapper[4933]: I1202 16:44:06.054394 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:44:06 crc kubenswrapper[4933]: E1202 16:44:06.056209 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:44:21 crc kubenswrapper[4933]: I1202 16:44:21.053177 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:44:21 crc kubenswrapper[4933]: E1202 16:44:21.055172 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:44:32 crc kubenswrapper[4933]: I1202 16:44:32.054157 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:44:32 crc kubenswrapper[4933]: E1202 16:44:32.055052 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:44:45 crc kubenswrapper[4933]: I1202 16:44:45.054080 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:44:45 crc kubenswrapper[4933]: E1202 16:44:45.055113 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.053565 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:45:00 crc kubenswrapper[4933]: E1202 16:45:00.054354 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.156376 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6"] Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.158524 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.161131 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.161344 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.172745 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6"] Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.252670 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8zqn\" (UniqueName: \"kubernetes.io/projected/27cff0b6-327c-4b71-b01b-ed77b98d24ea-kube-api-access-t8zqn\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.252772 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cff0b6-327c-4b71-b01b-ed77b98d24ea-config-volume\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.252858 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cff0b6-327c-4b71-b01b-ed77b98d24ea-secret-volume\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.354649 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8zqn\" (UniqueName: \"kubernetes.io/projected/27cff0b6-327c-4b71-b01b-ed77b98d24ea-kube-api-access-t8zqn\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.354731 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cff0b6-327c-4b71-b01b-ed77b98d24ea-config-volume\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.354779 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cff0b6-327c-4b71-b01b-ed77b98d24ea-secret-volume\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.355694 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cff0b6-327c-4b71-b01b-ed77b98d24ea-config-volume\") pod 
\"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.370526 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cff0b6-327c-4b71-b01b-ed77b98d24ea-secret-volume\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.374437 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8zqn\" (UniqueName: \"kubernetes.io/projected/27cff0b6-327c-4b71-b01b-ed77b98d24ea-kube-api-access-t8zqn\") pod \"collect-profiles-29411565-vmzq6\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:00 crc kubenswrapper[4933]: I1202 16:45:00.489980 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:01 crc kubenswrapper[4933]: I1202 16:45:01.085048 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6"] Dec 02 16:45:01 crc kubenswrapper[4933]: I1202 16:45:01.429657 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" event={"ID":"27cff0b6-327c-4b71-b01b-ed77b98d24ea","Type":"ContainerStarted","Data":"55b6840cbc2de06f8d5f558141ab0bd9efd15fd02b53cefa1764c60107739e6c"} Dec 02 16:45:01 crc kubenswrapper[4933]: I1202 16:45:01.430129 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" event={"ID":"27cff0b6-327c-4b71-b01b-ed77b98d24ea","Type":"ContainerStarted","Data":"b9e47356bc6afd25a8b34a98a7f8efc2e7eaceb4713dbc3a295dc8c4fcc1146d"} Dec 02 16:45:01 crc kubenswrapper[4933]: I1202 16:45:01.454001 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" podStartSLOduration=1.45398261 podStartE2EDuration="1.45398261s" podCreationTimestamp="2025-12-02 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 16:45:01.444029104 +0000 UTC m=+3164.695255817" watchObservedRunningTime="2025-12-02 16:45:01.45398261 +0000 UTC m=+3164.705209313" Dec 02 16:45:02 crc kubenswrapper[4933]: I1202 16:45:02.443165 4933 generic.go:334] "Generic (PLEG): container finished" podID="27cff0b6-327c-4b71-b01b-ed77b98d24ea" containerID="55b6840cbc2de06f8d5f558141ab0bd9efd15fd02b53cefa1764c60107739e6c" exitCode=0 Dec 02 16:45:02 crc kubenswrapper[4933]: I1202 16:45:02.443228 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" event={"ID":"27cff0b6-327c-4b71-b01b-ed77b98d24ea","Type":"ContainerDied","Data":"55b6840cbc2de06f8d5f558141ab0bd9efd15fd02b53cefa1764c60107739e6c"} Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.876292 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.963026 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8zqn\" (UniqueName: \"kubernetes.io/projected/27cff0b6-327c-4b71-b01b-ed77b98d24ea-kube-api-access-t8zqn\") pod \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.963201 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cff0b6-327c-4b71-b01b-ed77b98d24ea-config-volume\") pod \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.963505 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cff0b6-327c-4b71-b01b-ed77b98d24ea-secret-volume\") pod \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\" (UID: \"27cff0b6-327c-4b71-b01b-ed77b98d24ea\") " Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.972790 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27cff0b6-327c-4b71-b01b-ed77b98d24ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "27cff0b6-327c-4b71-b01b-ed77b98d24ea" (UID: "27cff0b6-327c-4b71-b01b-ed77b98d24ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.980095 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cff0b6-327c-4b71-b01b-ed77b98d24ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27cff0b6-327c-4b71-b01b-ed77b98d24ea" (UID: "27cff0b6-327c-4b71-b01b-ed77b98d24ea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:03 crc kubenswrapper[4933]: I1202 16:45:03.986719 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cff0b6-327c-4b71-b01b-ed77b98d24ea-kube-api-access-t8zqn" (OuterVolumeSpecName: "kube-api-access-t8zqn") pod "27cff0b6-327c-4b71-b01b-ed77b98d24ea" (UID: "27cff0b6-327c-4b71-b01b-ed77b98d24ea"). InnerVolumeSpecName "kube-api-access-t8zqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.068125 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cff0b6-327c-4b71-b01b-ed77b98d24ea-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.068158 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cff0b6-327c-4b71-b01b-ed77b98d24ea-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.068170 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8zqn\" (UniqueName: \"kubernetes.io/projected/27cff0b6-327c-4b71-b01b-ed77b98d24ea-kube-api-access-t8zqn\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.471049 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" event={"ID":"27cff0b6-327c-4b71-b01b-ed77b98d24ea","Type":"ContainerDied","Data":"b9e47356bc6afd25a8b34a98a7f8efc2e7eaceb4713dbc3a295dc8c4fcc1146d"} Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.471092 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9e47356bc6afd25a8b34a98a7f8efc2e7eaceb4713dbc3a295dc8c4fcc1146d" Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.471135 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6" Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.536504 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"] Dec 02 16:45:04 crc kubenswrapper[4933]: I1202 16:45:04.557267 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411520-6kmwh"] Dec 02 16:45:05 crc kubenswrapper[4933]: I1202 16:45:05.066433 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5122e65-9bfd-4382-8057-177aa3d4450c" path="/var/lib/kubelet/pods/f5122e65-9bfd-4382-8057-177aa3d4450c/volumes" Dec 02 16:45:11 crc kubenswrapper[4933]: I1202 16:45:11.053695 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:45:11 crc kubenswrapper[4933]: E1202 16:45:11.054758 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.124803 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8spjz"] Dec 02 16:45:20 crc kubenswrapper[4933]: E1202 16:45:20.165096 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cff0b6-327c-4b71-b01b-ed77b98d24ea" containerName="collect-profiles" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.165147 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cff0b6-327c-4b71-b01b-ed77b98d24ea" containerName="collect-profiles" Dec 02 16:45:20 crc 
kubenswrapper[4933]: I1202 16:45:20.166234 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cff0b6-327c-4b71-b01b-ed77b98d24ea" containerName="collect-profiles" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.170417 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8spjz"] Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.170585 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.266376 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-catalog-content\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.267069 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-utilities\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.267390 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wggj\" (UniqueName: \"kubernetes.io/projected/57555967-eb83-498d-aab4-59f4cd9f798e-kube-api-access-5wggj\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.370113 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-utilities\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.370256 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wggj\" (UniqueName: \"kubernetes.io/projected/57555967-eb83-498d-aab4-59f4cd9f798e-kube-api-access-5wggj\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.370285 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-catalog-content\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.370974 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-catalog-content\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.371241 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-utilities\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.392229 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wggj\" (UniqueName: \"kubernetes.io/projected/57555967-eb83-498d-aab4-59f4cd9f798e-kube-api-access-5wggj\") pod \"community-operators-8spjz\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:20 crc kubenswrapper[4933]: I1202 16:45:20.497847 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:21 crc kubenswrapper[4933]: I1202 16:45:21.032734 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8spjz"] Dec 02 16:45:21 crc kubenswrapper[4933]: I1202 16:45:21.679246 4933 generic.go:334] "Generic (PLEG): container finished" podID="57555967-eb83-498d-aab4-59f4cd9f798e" containerID="4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d" exitCode=0 Dec 02 16:45:21 crc kubenswrapper[4933]: I1202 16:45:21.679331 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8spjz" event={"ID":"57555967-eb83-498d-aab4-59f4cd9f798e","Type":"ContainerDied","Data":"4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d"} Dec 02 16:45:21 crc kubenswrapper[4933]: I1202 16:45:21.679623 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8spjz" event={"ID":"57555967-eb83-498d-aab4-59f4cd9f798e","Type":"ContainerStarted","Data":"86fefa5b5dd6d5bb9de4d4bfff9216ad0818509704abdd74c56a0e96fa8f153d"} Dec 02 16:45:23 crc kubenswrapper[4933]: I1202 16:45:23.724635 4933 generic.go:334] "Generic (PLEG): container finished" podID="57555967-eb83-498d-aab4-59f4cd9f798e" containerID="e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639" exitCode=0 Dec 02 16:45:23 crc kubenswrapper[4933]: I1202 16:45:23.724695 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8spjz" event={"ID":"57555967-eb83-498d-aab4-59f4cd9f798e","Type":"ContainerDied","Data":"e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639"} Dec 02 16:45:24 crc kubenswrapper[4933]: I1202 16:45:24.054865 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:45:24 crc kubenswrapper[4933]: E1202 16:45:24.055332 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:45:24 crc kubenswrapper[4933]: I1202 16:45:24.742050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8spjz" event={"ID":"57555967-eb83-498d-aab4-59f4cd9f798e","Type":"ContainerStarted","Data":"8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170"} Dec 02 16:45:24 crc kubenswrapper[4933]: I1202 16:45:24.767795 4933 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8spjz" podStartSLOduration=2.317059933 podStartE2EDuration="4.767774488s" podCreationTimestamp="2025-12-02 16:45:20 +0000 UTC" firstStartedPulling="2025-12-02 16:45:21.681068438 +0000 UTC m=+3184.932295151" lastFinishedPulling="2025-12-02 16:45:24.131783003 +0000 UTC m=+3187.383009706" observedRunningTime="2025-12-02 16:45:24.762703103 +0000 UTC m=+3188.013929836" watchObservedRunningTime="2025-12-02 16:45:24.767774488 +0000 UTC m=+3188.019001211" Dec 02 16:45:30 crc kubenswrapper[4933]: I1202 16:45:30.498129 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:30 crc kubenswrapper[4933]: I1202 16:45:30.498782 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:30 crc kubenswrapper[4933]: I1202 16:45:30.570165 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:30 crc kubenswrapper[4933]: I1202 16:45:30.857012 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:30 crc kubenswrapper[4933]: I1202 16:45:30.907047 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8spjz"] Dec 02 16:45:32 crc kubenswrapper[4933]: I1202 16:45:32.844095 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8spjz" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="registry-server" containerID="cri-o://8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170" gracePeriod=2 Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.330216 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.489211 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-utilities\") pod \"57555967-eb83-498d-aab4-59f4cd9f798e\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.489256 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-catalog-content\") pod \"57555967-eb83-498d-aab4-59f4cd9f798e\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.489393 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wggj\" (UniqueName: \"kubernetes.io/projected/57555967-eb83-498d-aab4-59f4cd9f798e-kube-api-access-5wggj\") pod \"57555967-eb83-498d-aab4-59f4cd9f798e\" (UID: \"57555967-eb83-498d-aab4-59f4cd9f798e\") " Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.490090 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-utilities" (OuterVolumeSpecName: "utilities") pod "57555967-eb83-498d-aab4-59f4cd9f798e" (UID: "57555967-eb83-498d-aab4-59f4cd9f798e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.490560 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.494914 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57555967-eb83-498d-aab4-59f4cd9f798e-kube-api-access-5wggj" (OuterVolumeSpecName: "kube-api-access-5wggj") pod "57555967-eb83-498d-aab4-59f4cd9f798e" (UID: "57555967-eb83-498d-aab4-59f4cd9f798e"). InnerVolumeSpecName "kube-api-access-5wggj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.542547 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57555967-eb83-498d-aab4-59f4cd9f798e" (UID: "57555967-eb83-498d-aab4-59f4cd9f798e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.593161 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57555967-eb83-498d-aab4-59f4cd9f798e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.593194 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wggj\" (UniqueName: \"kubernetes.io/projected/57555967-eb83-498d-aab4-59f4cd9f798e-kube-api-access-5wggj\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.859627 4933 generic.go:334] "Generic (PLEG): container finished" podID="57555967-eb83-498d-aab4-59f4cd9f798e" containerID="8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170" exitCode=0 Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.859677 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8spjz" event={"ID":"57555967-eb83-498d-aab4-59f4cd9f798e","Type":"ContainerDied","Data":"8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170"} Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.859716 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8spjz" event={"ID":"57555967-eb83-498d-aab4-59f4cd9f798e","Type":"ContainerDied","Data":"86fefa5b5dd6d5bb9de4d4bfff9216ad0818509704abdd74c56a0e96fa8f153d"} Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.859737 4933 scope.go:117] "RemoveContainer" containerID="8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.859755 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8spjz" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.890038 4933 scope.go:117] "RemoveContainer" containerID="e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.912201 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8spjz"] Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.920985 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8spjz"] Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.936502 4933 scope.go:117] "RemoveContainer" containerID="4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.971899 4933 scope.go:117] "RemoveContainer" containerID="8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170" Dec 02 16:45:33 crc kubenswrapper[4933]: E1202 16:45:33.972275 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170\": container with ID starting with 8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170 not found: ID does not exist" containerID="8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.972305 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170"} err="failed to get container status \"8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170\": rpc error: code = NotFound desc = could not find container \"8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170\": container with ID starting with 8db813bdeed00af40a324beaa2ca534a196a0a8e35f8d0419f1c4256988f5170 not found: ID does not exist" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.972326 4933 scope.go:117] "RemoveContainer" containerID="e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639" Dec 02 16:45:33 crc kubenswrapper[4933]: E1202 16:45:33.972591 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639\": container with ID starting with e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639 not found: ID does not exist" containerID="e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.972625 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639"} err="failed to get container status \"e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639\": rpc error: code = NotFound desc = could not find container \"e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639\": container with ID starting with e4552447daa69fae06972b18d7d137815b4bb20d8e0cf4cdfeaca5a361ff9639 not found: ID does not exist" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.972645 4933 scope.go:117] "RemoveContainer" containerID="4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d" Dec 02 16:45:33 crc kubenswrapper[4933]: E1202 16:45:33.972937 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d\": container with ID starting with 4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d not found: ID does not exist" containerID="4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d" Dec 02 16:45:33 crc kubenswrapper[4933]: I1202 16:45:33.972962 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d"} err="failed to get container status \"4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d\": rpc error: code = NotFound desc = could not find container \"4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d\": container with ID starting with 4ccedde9e9aadcbd8bb8f70e7ea4f71f7ed5a8b131c02d6ac2a186cf5c7f0d3d not found: ID does not exist" Dec 02 16:45:35 crc kubenswrapper[4933]: I1202 16:45:35.067894 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" path="/var/lib/kubelet/pods/57555967-eb83-498d-aab4-59f4cd9f798e/volumes" Dec 02 16:45:37 crc kubenswrapper[4933]: I1202 16:45:37.064870 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:45:37 crc kubenswrapper[4933]: E1202 16:45:37.065588 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:45:41 crc kubenswrapper[4933]: I1202 16:45:41.959741 4933 generic.go:334] "Generic (PLEG): container finished" podID="06b52c33-11b2-4a83-a9b8-2de5845e6e89" containerID="6400317194d6c65ae61777c8bfcc362ab95fb4aa579dd0be2d8087553d023a3f" exitCode=0 Dec 02 16:45:41 crc kubenswrapper[4933]: I1202 16:45:41.959867 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" event={"ID":"06b52c33-11b2-4a83-a9b8-2de5845e6e89","Type":"ContainerDied","Data":"6400317194d6c65ae61777c8bfcc362ab95fb4aa579dd0be2d8087553d023a3f"} Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.453491 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529187 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-inventory\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529331 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq9ts\" (UniqueName: \"kubernetes.io/projected/06b52c33-11b2-4a83-a9b8-2de5845e6e89-kube-api-access-pq9ts\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529358 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ssh-key\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529404 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-2\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529527 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-1\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529547 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-0\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.529628 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-telemetry-power-monitoring-combined-ca-bundle\") pod \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\" (UID: \"06b52c33-11b2-4a83-a9b8-2de5845e6e89\") " Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.535120 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b52c33-11b2-4a83-a9b8-2de5845e6e89-kube-api-access-pq9ts" (OuterVolumeSpecName: "kube-api-access-pq9ts") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "kube-api-access-pq9ts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.536001 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.563121 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-inventory" (OuterVolumeSpecName: "inventory") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.564199 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.567429 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.578097 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.581807 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06b52c33-11b2-4a83-a9b8-2de5845e6e89" (UID: "06b52c33-11b2-4a83-a9b8-2de5845e6e89"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.633104 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq9ts\" (UniqueName: \"kubernetes.io/projected/06b52c33-11b2-4a83-a9b8-2de5845e6e89-kube-api-access-pq9ts\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.633727 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.633861 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.633963 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.634041 4933 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.634121 4933 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.634231 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06b52c33-11b2-4a83-a9b8-2de5845e6e89-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.990180 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" event={"ID":"06b52c33-11b2-4a83-a9b8-2de5845e6e89","Type":"ContainerDied","Data":"6b376b01a97f9ef4d25a73b007737b6a914c8acb9938e485cc68376806ab6c10"} Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.990227 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b376b01a97f9ef4d25a73b007737b6a914c8acb9938e485cc68376806ab6c10" Dec 02 16:45:43 crc kubenswrapper[4933]: I1202 16:45:43.990283 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.112237 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm"] Dec 02 16:45:44 crc kubenswrapper[4933]: E1202 16:45:44.112695 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="extract-utilities" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.112712 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="extract-utilities" Dec 02 16:45:44 crc kubenswrapper[4933]: E1202 16:45:44.112725 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b52c33-11b2-4a83-a9b8-2de5845e6e89" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.112734 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b52c33-11b2-4a83-a9b8-2de5845e6e89" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 02 16:45:44 crc kubenswrapper[4933]: E1202 16:45:44.112793 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="extract-content" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.112804 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="extract-content" Dec 02 16:45:44 crc kubenswrapper[4933]: E1202 16:45:44.112844 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="registry-server" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.112853 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="registry-server" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.113071 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b52c33-11b2-4a83-a9b8-2de5845e6e89" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.113101 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="57555967-eb83-498d-aab4-59f4cd9f798e" containerName="registry-server" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.113897 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.116195 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.116771 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.117540 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mlmmm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.117582 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.117539 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.128343 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm"] Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.249016 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.249068 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.249114 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl8x8\" (UniqueName: \"kubernetes.io/projected/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-kube-api-access-bl8x8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.249586 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.249795 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.352772 4933 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.352974 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.353195 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.353287 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.353384 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl8x8\" (UniqueName: \"kubernetes.io/projected/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-kube-api-access-bl8x8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.358397 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.358782 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.359945 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.361138 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.375253 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl8x8\" (UniqueName: \"kubernetes.io/projected/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-kube-api-access-bl8x8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-995pm\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:44 crc kubenswrapper[4933]: I1202 16:45:44.437664 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" Dec 02 16:45:45 crc kubenswrapper[4933]: I1202 16:45:45.010692 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm"] Dec 02 16:45:46 crc kubenswrapper[4933]: I1202 16:45:46.022430 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" event={"ID":"e28d7f01-efc3-4011-baea-61cc2a6f0cd9","Type":"ContainerStarted","Data":"dc6d43f807c0f8f4d2489527f9f105e755fcd0d5cc4c0e650f01509ea3ead587"} Dec 02 16:45:46 crc kubenswrapper[4933]: I1202 16:45:46.023029 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" event={"ID":"e28d7f01-efc3-4011-baea-61cc2a6f0cd9","Type":"ContainerStarted","Data":"99f938f328229bed87e3234a8e1040e2f1543c47c66a8eb10ec608f6cb20a4b3"} Dec 02 16:45:46 crc kubenswrapper[4933]: I1202 16:45:46.054643 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" podStartSLOduration=1.548359721 podStartE2EDuration="2.054628703s" podCreationTimestamp="2025-12-02 16:45:44 +0000 UTC" firstStartedPulling="2025-12-02 16:45:45.00615186 +0000 UTC m=+3208.257378563" lastFinishedPulling="2025-12-02 16:45:45.512420832 +0000 UTC m=+3208.763647545" observedRunningTime="2025-12-02 16:45:46.04513789 +0000 UTC m=+3209.296364593" watchObservedRunningTime="2025-12-02 16:45:46.054628703 +0000 UTC m=+3209.305855406" Dec 02 16:45:46 crc kubenswrapper[4933]: I1202 16:45:46.732997 4933 scope.go:117] "RemoveContainer" containerID="1b55922489ae09b24390d69952c65f42a945ff2320201273adb2f754532f3c1a" Dec 02 16:45:51 crc kubenswrapper[4933]: I1202 16:45:51.053352 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767" Dec 02 16:45:51 crc kubenswrapper[4933]: E1202 16:45:51.054550 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:46:01 crc kubenswrapper[4933]: I1202 16:46:01.215237 4933 generic.go:334] "Generic (PLEG): container finished" podID="e28d7f01-efc3-4011-baea-61cc2a6f0cd9" containerID="dc6d43f807c0f8f4d2489527f9f105e755fcd0d5cc4c0e650f01509ea3ead587" exitCode=0 Dec 02 16:46:01 crc 
kubenswrapper[4933]: I1202 16:46:01.215317 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" event={"ID":"e28d7f01-efc3-4011-baea-61cc2a6f0cd9","Type":"ContainerDied","Data":"dc6d43f807c0f8f4d2489527f9f105e755fcd0d5cc4c0e650f01509ea3ead587"}
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.735602 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm"
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.820292 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl8x8\" (UniqueName: \"kubernetes.io/projected/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-kube-api-access-bl8x8\") pod \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") "
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.820787 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-ssh-key\") pod \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") "
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.821189 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-0\") pod \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") "
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.821268 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-1\") pod \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") "
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.821313 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-inventory\") pod \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\" (UID: \"e28d7f01-efc3-4011-baea-61cc2a6f0cd9\") "
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.828319 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-kube-api-access-bl8x8" (OuterVolumeSpecName: "kube-api-access-bl8x8") pod "e28d7f01-efc3-4011-baea-61cc2a6f0cd9" (UID: "e28d7f01-efc3-4011-baea-61cc2a6f0cd9"). InnerVolumeSpecName "kube-api-access-bl8x8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.860274 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "e28d7f01-efc3-4011-baea-61cc2a6f0cd9" (UID: "e28d7f01-efc3-4011-baea-61cc2a6f0cd9"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.860326 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e28d7f01-efc3-4011-baea-61cc2a6f0cd9" (UID: "e28d7f01-efc3-4011-baea-61cc2a6f0cd9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.860986 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-inventory" (OuterVolumeSpecName: "inventory") pod "e28d7f01-efc3-4011-baea-61cc2a6f0cd9" (UID: "e28d7f01-efc3-4011-baea-61cc2a6f0cd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.886485 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "e28d7f01-efc3-4011-baea-61cc2a6f0cd9" (UID: "e28d7f01-efc3-4011-baea-61cc2a6f0cd9"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.925475 4933 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.926182 4933 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.926276 4933 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.926360 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl8x8\" (UniqueName: \"kubernetes.io/projected/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-kube-api-access-bl8x8\") on node \"crc\" DevicePath \"\""
Dec 02 16:46:02 crc kubenswrapper[4933]: I1202 16:46:02.926436 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e28d7f01-efc3-4011-baea-61cc2a6f0cd9-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 16:46:03 crc kubenswrapper[4933]: I1202 16:46:03.244758 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm" event={"ID":"e28d7f01-efc3-4011-baea-61cc2a6f0cd9","Type":"ContainerDied","Data":"99f938f328229bed87e3234a8e1040e2f1543c47c66a8eb10ec608f6cb20a4b3"}
Dec 02 16:46:03 crc kubenswrapper[4933]: I1202 16:46:03.245023 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f938f328229bed87e3234a8e1040e2f1543c47c66a8eb10ec608f6cb20a4b3"
Dec 02 16:46:03 crc kubenswrapper[4933]: I1202 16:46:03.244848 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-995pm"
Dec 02 16:46:06 crc kubenswrapper[4933]: I1202 16:46:06.053580 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"
Dec 02 16:46:06 crc kubenswrapper[4933]: E1202 16:46:06.054343 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:46:21 crc kubenswrapper[4933]: I1202 16:46:21.059073 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"
Dec 02 16:46:21 crc kubenswrapper[4933]: E1202 16:46:21.060412 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:46:33 crc kubenswrapper[4933]: I1202 16:46:33.054784 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"
Dec 02 16:46:33 crc kubenswrapper[4933]: E1202 16:46:33.055678 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:46:46 crc kubenswrapper[4933]: I1202 16:46:46.052959 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"
Dec 02 16:46:46 crc kubenswrapper[4933]: E1202 16:46:46.053775 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:46:59 crc kubenswrapper[4933]: I1202 16:46:59.053886 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"
Dec 02 16:46:59 crc kubenswrapper[4933]: I1202 16:46:59.948973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"317be8989886cb7a67b142af24f3660c896333713b671c1f9653516d46c7121c"}
Dec 02 16:49:17 crc kubenswrapper[4933]: I1202 16:49:17.169111 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:49:17 crc kubenswrapper[4933]: I1202 16:49:17.169658 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:49:47 crc kubenswrapper[4933]: I1202 16:49:47.169461 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:49:47 crc kubenswrapper[4933]: I1202 16:49:47.169910 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.189536 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tgsz8"]
Dec 02 16:50:01 crc kubenswrapper[4933]: E1202 16:50:01.190734 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d7f01-efc3-4011-baea-61cc2a6f0cd9" containerName="logging-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.190754 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d7f01-efc3-4011-baea-61cc2a6f0cd9" containerName="logging-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.192913 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28d7f01-efc3-4011-baea-61cc2a6f0cd9" containerName="logging-edpm-deployment-openstack-edpm-ipam"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.195138 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.210248 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgsz8"]
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.353684 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-catalog-content\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.353928 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vld6d\" (UniqueName: \"kubernetes.io/projected/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-kube-api-access-vld6d\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.354017 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-utilities\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.456116 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-catalog-content\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.456271 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vld6d\" (UniqueName: \"kubernetes.io/projected/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-kube-api-access-vld6d\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.456349 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-utilities\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.456580 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-catalog-content\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.456848 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-utilities\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.488968 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vld6d\" (UniqueName: \"kubernetes.io/projected/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-kube-api-access-vld6d\") pod \"redhat-marketplace-tgsz8\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") " pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:01 crc kubenswrapper[4933]: I1202 16:50:01.522397 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:02 crc kubenswrapper[4933]: I1202 16:50:02.033505 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgsz8"]
Dec 02 16:50:02 crc kubenswrapper[4933]: I1202 16:50:02.081541 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerStarted","Data":"c72462c57e2d388721bf963b5780b4638036453a5d343c698cb2a33f2d96e73b"}
Dec 02 16:50:03 crc kubenswrapper[4933]: I1202 16:50:03.093734 4933 generic.go:334] "Generic (PLEG): container finished" podID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerID="25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398" exitCode=0
Dec 02 16:50:03 crc kubenswrapper[4933]: I1202 16:50:03.093806 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerDied","Data":"25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398"}
Dec 02 16:50:03 crc kubenswrapper[4933]: I1202 16:50:03.096461 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 16:50:04 crc kubenswrapper[4933]: I1202 16:50:04.119978 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerStarted","Data":"0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8"}
Dec 02 16:50:05 crc kubenswrapper[4933]: I1202 16:50:05.133684 4933 generic.go:334] "Generic (PLEG): container finished" podID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerID="0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8" exitCode=0
Dec 02 16:50:05 crc kubenswrapper[4933]: I1202 16:50:05.133783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerDied","Data":"0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8"}
Dec 02 16:50:06 crc kubenswrapper[4933]: I1202 16:50:06.146938 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerStarted","Data":"4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b"}
Dec 02 16:50:06 crc kubenswrapper[4933]: I1202 16:50:06.175376 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tgsz8" podStartSLOduration=2.6440906 podStartE2EDuration="5.175353911s" podCreationTimestamp="2025-12-02 16:50:01 +0000 UTC" firstStartedPulling="2025-12-02 16:50:03.096171915 +0000 UTC m=+3466.347398628" lastFinishedPulling="2025-12-02 16:50:05.627435226 +0000 UTC m=+3468.878661939" observedRunningTime="2025-12-02 16:50:06.163575748 +0000 UTC m=+3469.414802471" watchObservedRunningTime="2025-12-02 16:50:06.175353911 +0000 UTC m=+3469.426580614"
Dec 02 16:50:11 crc kubenswrapper[4933]: I1202 16:50:11.522700 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:11 crc kubenswrapper[4933]: I1202 16:50:11.523502 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:11 crc kubenswrapper[4933]: I1202 16:50:11.589176 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:12 crc kubenswrapper[4933]: I1202 16:50:12.281508 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:12 crc kubenswrapper[4933]: I1202 16:50:12.368503 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgsz8"]
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.232396 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tgsz8" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="registry-server" containerID="cri-o://4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b" gracePeriod=2
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.804762 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.903908 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-catalog-content\") pod \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") "
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.904093 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vld6d\" (UniqueName: \"kubernetes.io/projected/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-kube-api-access-vld6d\") pod \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") "
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.904175 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-utilities\") pod \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\" (UID: \"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3\") "
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.904798 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-utilities" (OuterVolumeSpecName: "utilities") pod "bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" (UID: "bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.910073 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-kube-api-access-vld6d" (OuterVolumeSpecName: "kube-api-access-vld6d") pod "bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" (UID: "bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3"). InnerVolumeSpecName "kube-api-access-vld6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:50:14 crc kubenswrapper[4933]: I1202 16:50:14.921308 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" (UID: "bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.007281 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vld6d\" (UniqueName: \"kubernetes.io/projected/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-kube-api-access-vld6d\") on node \"crc\" DevicePath \"\""
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.007851 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.007929 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.245052 4933 generic.go:334] "Generic (PLEG): container finished" podID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerID="4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b" exitCode=0
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.245892 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgsz8"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.245915 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerDied","Data":"4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b"}
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.246361 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgsz8" event={"ID":"bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3","Type":"ContainerDied","Data":"c72462c57e2d388721bf963b5780b4638036453a5d343c698cb2a33f2d96e73b"}
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.246381 4933 scope.go:117] "RemoveContainer" containerID="4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.272088 4933 scope.go:117] "RemoveContainer" containerID="0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.281914 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgsz8"]
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.293668 4933 scope.go:117] "RemoveContainer" containerID="25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.295007 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgsz8"]
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.347526 4933 scope.go:117] "RemoveContainer" containerID="4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b"
Dec 02 16:50:15 crc kubenswrapper[4933]: E1202 16:50:15.348013 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b\": container with ID starting with 4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b not found: ID does not exist" containerID="4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.348054 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b"} err="failed to get container status \"4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b\": rpc error: code = NotFound desc = could not find container \"4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b\": container with ID starting with 4fd03478222118366efddbaf9962ef21bc17542df142f1bf2f557221957d216b not found: ID does not exist"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.348157 4933 scope.go:117] "RemoveContainer" containerID="0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8"
Dec 02 16:50:15 crc kubenswrapper[4933]: E1202 16:50:15.348725 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8\": container with ID starting with 0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8 not found: ID does not exist" containerID="0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.348763 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8"} err="failed to get container status \"0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8\": rpc error: code = NotFound desc = could not find container \"0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8\": container with ID starting with 0d250e418302efb557f5463b12d2884ee9d436484961d06f0cdc9df06a4229a8 not found: ID does not exist"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.348778 4933 scope.go:117] "RemoveContainer" containerID="25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398"
Dec 02 16:50:15 crc kubenswrapper[4933]: E1202 16:50:15.349159 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398\": container with ID starting with 25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398 not found: ID does not exist" containerID="25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398"
Dec 02 16:50:15 crc kubenswrapper[4933]: I1202 16:50:15.349176 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398"} err="failed to get container status \"25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398\": rpc error: code = NotFound desc = could not find container \"25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398\": container with ID starting with 25c1ae234e160d2bdbcb782d70545ba07545b28f556b2e7b974fb58768380398 not found: ID does not exist"
Dec 02 16:50:17 crc kubenswrapper[4933]: I1202 16:50:17.069694 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" path="/var/lib/kubelet/pods/bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3/volumes"
Dec 02 16:50:17 crc kubenswrapper[4933]: I1202 16:50:17.169365 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:50:17 crc kubenswrapper[4933]: I1202 16:50:17.169472 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:50:17 crc kubenswrapper[4933]: I1202 16:50:17.169559 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 16:50:17 crc kubenswrapper[4933]: I1202 16:50:17.171333 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"317be8989886cb7a67b142af24f3660c896333713b671c1f9653516d46c7121c"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 16:50:17 crc kubenswrapper[4933]: I1202 16:50:17.171434 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://317be8989886cb7a67b142af24f3660c896333713b671c1f9653516d46c7121c" gracePeriod=600
Dec 02 16:50:18 crc kubenswrapper[4933]: I1202 16:50:18.281995 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="317be8989886cb7a67b142af24f3660c896333713b671c1f9653516d46c7121c" exitCode=0
Dec 02 16:50:18 crc kubenswrapper[4933]: I1202 16:50:18.282050 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"317be8989886cb7a67b142af24f3660c896333713b671c1f9653516d46c7121c"}
Dec 02 16:50:18 crc kubenswrapper[4933]: I1202 16:50:18.282324 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065"}
Dec 02 16:50:18 crc kubenswrapper[4933]: I1202 16:50:18.282358 4933 scope.go:117] "RemoveContainer" containerID="68b78e6cd96af98221363c9fce993885171fbe98149e90f073dca0715a2cf767"
Dec 02 16:50:56 crc kubenswrapper[4933]: E1202 16:50:56.761089 4933 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:36586->38.102.83.213:44657: write tcp 38.102.83.213:36586->38.102.83.213:44657: write: broken pipe
Dec 02 16:52:47 crc kubenswrapper[4933]: I1202 16:52:47.169980 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:52:47 crc kubenswrapper[4933]: I1202 16:52:47.170621 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:53:17 crc kubenswrapper[4933]: I1202 16:53:17.169654 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:53:17 crc kubenswrapper[4933]: I1202 16:53:17.170371 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.898700 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tvfll"]
Dec 02 16:53:34 crc kubenswrapper[4933]: E1202 16:53:34.900067 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="extract-utilities"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.900094 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="extract-utilities"
Dec 02 16:53:34 crc kubenswrapper[4933]: E1202 16:53:34.900120 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="extract-content"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.900132 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="extract-content"
Dec 02 16:53:34 crc kubenswrapper[4933]: E1202 16:53:34.900202 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="registry-server"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.900215 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="registry-server"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.900585 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9e50fd-5ff0-4e00-828a-56b1bba8a1b3" containerName="registry-server"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.903158 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:34 crc kubenswrapper[4933]: I1202 16:53:34.926564 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tvfll"]
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.055178 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-utilities\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.055230 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7c2d\" (UniqueName: \"kubernetes.io/projected/00804f87-39e2-409a-bb48-6e4e1574bacd-kube-api-access-n7c2d\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.055504 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-catalog-content\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.158261 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-utilities\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.158334 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7c2d\" (UniqueName: \"kubernetes.io/projected/00804f87-39e2-409a-bb48-6e4e1574bacd-kube-api-access-n7c2d\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.158417 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-catalog-content\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.159379 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-utilities\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.159414 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-catalog-content\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.195968 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7c2d\" (UniqueName: \"kubernetes.io/projected/00804f87-39e2-409a-bb48-6e4e1574bacd-kube-api-access-n7c2d\") pod \"certified-operators-tvfll\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") " pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.234064 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:35 crc kubenswrapper[4933]: I1202 16:53:35.891020 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tvfll"]
Dec 02 16:53:36 crc kubenswrapper[4933]: I1202 16:53:36.674341 4933 generic.go:334] "Generic (PLEG): container finished" podID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerID="cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73" exitCode=0
Dec 02 16:53:36 crc kubenswrapper[4933]: I1202 16:53:36.674403 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvfll" event={"ID":"00804f87-39e2-409a-bb48-6e4e1574bacd","Type":"ContainerDied","Data":"cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73"}
Dec 02 16:53:36 crc kubenswrapper[4933]: I1202 16:53:36.674651 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvfll" event={"ID":"00804f87-39e2-409a-bb48-6e4e1574bacd","Type":"ContainerStarted","Data":"5faf51e423c0abfd4f00f469f68b946ecdbb907c0fa329d63ea943d1d0e29b8f"}
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.300045 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-stw6b"]
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.302924 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.328698 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-stw6b"]
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.419292 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-catalog-content\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.419406 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-utilities\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.419453 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8k5\" (UniqueName: \"kubernetes.io/projected/ab1b92a3-0498-4a38-8ea3-e45e7619a111-kube-api-access-px8k5\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.522135 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-catalog-content\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.522235 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-utilities\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.522270 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8k5\" (UniqueName: \"kubernetes.io/projected/ab1b92a3-0498-4a38-8ea3-e45e7619a111-kube-api-access-px8k5\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.522802 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-catalog-content\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.522815 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-utilities\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.544814 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8k5\" (UniqueName: \"kubernetes.io/projected/ab1b92a3-0498-4a38-8ea3-e45e7619a111-kube-api-access-px8k5\") pod \"redhat-operators-stw6b\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") " pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:37 crc kubenswrapper[4933]: I1202 16:53:37.623683 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:38 crc kubenswrapper[4933]: I1202 16:53:38.105387 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-stw6b"]
Dec 02 16:53:38 crc kubenswrapper[4933]: W1202 16:53:38.115150 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab1b92a3_0498_4a38_8ea3_e45e7619a111.slice/crio-5650871bc008f9cee4d3d98e76398481ea69eee34a2f46b07a0d6f904b0fa03c WatchSource:0}: Error finding container 5650871bc008f9cee4d3d98e76398481ea69eee34a2f46b07a0d6f904b0fa03c: Status 404 returned error can't find the container with id 5650871bc008f9cee4d3d98e76398481ea69eee34a2f46b07a0d6f904b0fa03c
Dec 02 16:53:38 crc kubenswrapper[4933]: I1202 16:53:38.699465 4933 generic.go:334] "Generic (PLEG): container finished" podID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerID="b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b" exitCode=0
Dec 02 16:53:38 crc kubenswrapper[4933]: I1202 16:53:38.699579 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerDied","Data":"b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b"}
Dec 02 16:53:38 crc kubenswrapper[4933]: I1202 16:53:38.699825 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerStarted","Data":"5650871bc008f9cee4d3d98e76398481ea69eee34a2f46b07a0d6f904b0fa03c"}
Dec 02 16:53:38 crc kubenswrapper[4933]: I1202 16:53:38.702664 4933 generic.go:334] "Generic (PLEG): container finished" podID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerID="c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a" exitCode=0
Dec 02 16:53:38 crc kubenswrapper[4933]: I1202 16:53:38.702707 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvfll" event={"ID":"00804f87-39e2-409a-bb48-6e4e1574bacd","Type":"ContainerDied","Data":"c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a"}
Dec 02 16:53:39 crc kubenswrapper[4933]: I1202 16:53:39.723680 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerStarted","Data":"04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738"}
Dec 02 16:53:39 crc kubenswrapper[4933]: I1202 16:53:39.728447 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvfll" event={"ID":"00804f87-39e2-409a-bb48-6e4e1574bacd","Type":"ContainerStarted","Data":"ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0"}
Dec 02 16:53:39 crc kubenswrapper[4933]: I1202 16:53:39.803887 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tvfll" podStartSLOduration=3.169497266 podStartE2EDuration="5.803855433s" podCreationTimestamp="2025-12-02 16:53:34 +0000 UTC" firstStartedPulling="2025-12-02 16:53:36.677565899 +0000 UTC m=+3679.928792602" lastFinishedPulling="2025-12-02 16:53:39.311924066 +0000 UTC m=+3682.563150769" observedRunningTime="2025-12-02 16:53:39.785891594 +0000 UTC m=+3683.037118327" watchObservedRunningTime="2025-12-02 16:53:39.803855433 +0000 UTC m=+3683.055082136"
Dec 02 16:53:42 crc kubenswrapper[4933]: I1202 16:53:42.761181 4933 generic.go:334] "Generic (PLEG): container finished" podID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerID="04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738" exitCode=0
Dec 02 16:53:42 crc kubenswrapper[4933]: I1202 16:53:42.761247 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerDied","Data":"04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738"}
Dec 02 16:53:43 crc kubenswrapper[4933]: I1202 16:53:43.788464 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerStarted","Data":"55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3"}
Dec 02 16:53:43 crc kubenswrapper[4933]: I1202 16:53:43.807032 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-stw6b" podStartSLOduration=2.242849282 podStartE2EDuration="6.807016366s" podCreationTimestamp="2025-12-02 16:53:37 +0000 UTC" firstStartedPulling="2025-12-02 16:53:38.701670601 +0000 UTC m=+3681.952897304" lastFinishedPulling="2025-12-02 16:53:43.265837685 +0000 UTC m=+3686.517064388" observedRunningTime="2025-12-02 16:53:43.805332841 +0000 UTC m=+3687.056559544" watchObservedRunningTime="2025-12-02 16:53:43.807016366 +0000 UTC m=+3687.058243069"
Dec 02 16:53:45 crc kubenswrapper[4933]: I1202 16:53:45.235272 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:45 crc kubenswrapper[4933]: I1202 16:53:45.235639 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:45 crc kubenswrapper[4933]: I1202 16:53:45.287573 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:45 crc kubenswrapper[4933]: I1202 16:53:45.867201 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.169146 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.169726 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.169769 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.171173 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.171247 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" gracePeriod=600
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.275219 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tvfll"]
Dec 02 16:53:47 crc kubenswrapper[4933]: E1202 16:53:47.290703 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.624932 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.624996 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.833986 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" exitCode=0
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.834049 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065"}
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.834378 4933 scope.go:117] "RemoveContainer" containerID="317be8989886cb7a67b142af24f3660c896333713b671c1f9653516d46c7121c"
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.834505 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tvfll" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="registry-server" containerID="cri-o://ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0" gracePeriod=2
Dec 02 16:53:47 crc kubenswrapper[4933]: I1202 16:53:47.835574 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065"
Dec 02 16:53:47 crc kubenswrapper[4933]: E1202 16:53:47.836271 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.537209 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.700236 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-stw6b" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="registry-server" probeResult="failure" output=<
Dec 02 16:53:48 crc kubenswrapper[4933]: 	timeout: failed to connect service ":50051" within 1s
Dec 02 16:53:48 crc kubenswrapper[4933]: >
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.717523 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-catalog-content\") pod \"00804f87-39e2-409a-bb48-6e4e1574bacd\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") "
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.717738 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7c2d\" (UniqueName: \"kubernetes.io/projected/00804f87-39e2-409a-bb48-6e4e1574bacd-kube-api-access-n7c2d\") pod \"00804f87-39e2-409a-bb48-6e4e1574bacd\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") "
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.717923 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-utilities\") pod \"00804f87-39e2-409a-bb48-6e4e1574bacd\" (UID: \"00804f87-39e2-409a-bb48-6e4e1574bacd\") "
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.718855 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-utilities" (OuterVolumeSpecName: "utilities") pod "00804f87-39e2-409a-bb48-6e4e1574bacd" (UID: "00804f87-39e2-409a-bb48-6e4e1574bacd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.725890 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00804f87-39e2-409a-bb48-6e4e1574bacd-kube-api-access-n7c2d" (OuterVolumeSpecName: "kube-api-access-n7c2d") pod "00804f87-39e2-409a-bb48-6e4e1574bacd" (UID: "00804f87-39e2-409a-bb48-6e4e1574bacd"). InnerVolumeSpecName "kube-api-access-n7c2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.760375 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00804f87-39e2-409a-bb48-6e4e1574bacd" (UID: "00804f87-39e2-409a-bb48-6e4e1574bacd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.820102 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.820133 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00804f87-39e2-409a-bb48-6e4e1574bacd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.820147 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7c2d\" (UniqueName: \"kubernetes.io/projected/00804f87-39e2-409a-bb48-6e4e1574bacd-kube-api-access-n7c2d\") on node \"crc\" DevicePath \"\""
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.850708 4933 generic.go:334] "Generic (PLEG): container finished" podID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerID="ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0" exitCode=0
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.850761 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvfll" event={"ID":"00804f87-39e2-409a-bb48-6e4e1574bacd","Type":"ContainerDied","Data":"ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0"}
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.850792 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvfll" event={"ID":"00804f87-39e2-409a-bb48-6e4e1574bacd","Type":"ContainerDied","Data":"5faf51e423c0abfd4f00f469f68b946ecdbb907c0fa329d63ea943d1d0e29b8f"}
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.850813 4933 scope.go:117] "RemoveContainer" containerID="ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0"
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.851004 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tvfll"
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.887787 4933 scope.go:117] "RemoveContainer" containerID="c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a"
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.887862 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tvfll"]
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.899467 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tvfll"]
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.917371 4933 scope.go:117] "RemoveContainer" containerID="cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73"
Dec 02 16:53:48 crc kubenswrapper[4933]: I1202 16:53:48.999450 4933 scope.go:117] "RemoveContainer" containerID="ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0"
Dec 02 16:53:49 crc kubenswrapper[4933]: E1202 16:53:48.999980 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0\": container with ID starting with ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0 not found: ID does not exist" containerID="ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0"
Dec 02 16:53:49 crc kubenswrapper[4933]: I1202 16:53:49.000017 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0"} err="failed to get container status \"ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0\": rpc error: code = NotFound desc = could not find container \"ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0\": container with ID starting with ff67ae121f45c2d0d210e75143d688cd533f6c40f64703d9899f1ede9d69f4d0 not found: ID does not exist"
Dec 02 16:53:49 crc kubenswrapper[4933]: I1202 16:53:49.000038 4933 scope.go:117] "RemoveContainer" containerID="c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a"
Dec 02 16:53:49 crc kubenswrapper[4933]: E1202 16:53:49.000471 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a\": container with ID starting with c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a not found: ID does not exist" containerID="c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a"
Dec 02 16:53:49 crc kubenswrapper[4933]: I1202 16:53:49.000491 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a"} err="failed to get container status \"c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a\": rpc error: code = NotFound desc = could not find container \"c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a\": container with ID starting with c2468b19aeae0208f23eda7a80502ecb9884de6bde0e09a2d1e4f3258b2a2e9a not found: ID does not exist"
Dec 02 16:53:49 crc kubenswrapper[4933]: I1202 16:53:49.000503 4933 scope.go:117] "RemoveContainer" containerID="cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73"
Dec 02 16:53:49 crc kubenswrapper[4933]: E1202 16:53:49.000806 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73\": container with ID starting with cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73 not found: ID does not exist" containerID="cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73"
Dec 02 16:53:49 crc kubenswrapper[4933]: I1202 16:53:49.000849 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73"} err="failed to get container status \"cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73\": rpc error: code = NotFound desc = could not find container \"cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73\": container with ID starting with cdf944372b4fcd8ad048cf557d02460d2b68beac2c166a5a28a3a21ed904fc73 not found: ID does not exist"
Dec 02 16:53:49 crc kubenswrapper[4933]: I1202 16:53:49.068603 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" path="/var/lib/kubelet/pods/00804f87-39e2-409a-bb48-6e4e1574bacd/volumes"
Dec 02 16:53:57 crc kubenswrapper[4933]: I1202 16:53:57.686458 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:57 crc kubenswrapper[4933]: I1202 16:53:57.747664 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:57 crc kubenswrapper[4933]: I1202 16:53:57.934642 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-stw6b"]
Dec 02 16:53:58 crc kubenswrapper[4933]: I1202 16:53:58.963157 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-stw6b" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="registry-server" containerID="cri-o://55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3" gracePeriod=2
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.541435 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-stw6b"
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.692436 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-catalog-content\") pod \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") "
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.692707 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px8k5\" (UniqueName: \"kubernetes.io/projected/ab1b92a3-0498-4a38-8ea3-e45e7619a111-kube-api-access-px8k5\") pod \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") "
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.692785 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-utilities\") pod \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\" (UID: \"ab1b92a3-0498-4a38-8ea3-e45e7619a111\") "
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.695048 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-utilities" (OuterVolumeSpecName: "utilities") pod "ab1b92a3-0498-4a38-8ea3-e45e7619a111" (UID: "ab1b92a3-0498-4a38-8ea3-e45e7619a111"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.728068 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab1b92a3-0498-4a38-8ea3-e45e7619a111-kube-api-access-px8k5" (OuterVolumeSpecName: "kube-api-access-px8k5") pod "ab1b92a3-0498-4a38-8ea3-e45e7619a111" (UID: "ab1b92a3-0498-4a38-8ea3-e45e7619a111"). InnerVolumeSpecName "kube-api-access-px8k5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.797078 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px8k5\" (UniqueName: \"kubernetes.io/projected/ab1b92a3-0498-4a38-8ea3-e45e7619a111-kube-api-access-px8k5\") on node \"crc\" DevicePath \"\""
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.797338 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.810906 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab1b92a3-0498-4a38-8ea3-e45e7619a111" (UID: "ab1b92a3-0498-4a38-8ea3-e45e7619a111"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.900007 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1b92a3-0498-4a38-8ea3-e45e7619a111-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.974576 4933 generic.go:334] "Generic (PLEG): container finished" podID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerID="55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3" exitCode=0 Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.974659 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerDied","Data":"55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3"} Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.974686 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stw6b" event={"ID":"ab1b92a3-0498-4a38-8ea3-e45e7619a111","Type":"ContainerDied","Data":"5650871bc008f9cee4d3d98e76398481ea69eee34a2f46b07a0d6f904b0fa03c"} Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.974703 4933 scope.go:117] "RemoveContainer" containerID="55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3" Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.974717 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-stw6b" Dec 02 16:53:59 crc kubenswrapper[4933]: I1202 16:53:59.999536 4933 scope.go:117] "RemoveContainer" containerID="04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738" Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.016994 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-stw6b"] Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.026211 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-stw6b"] Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.031963 4933 scope.go:117] "RemoveContainer" containerID="b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b" Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.108456 4933 scope.go:117] "RemoveContainer" containerID="55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3" Dec 02 16:54:00 crc kubenswrapper[4933]: E1202 16:54:00.108997 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3\": container with ID starting with 55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3 not found: ID does not exist" containerID="55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3" Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.109028 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3"} err="failed to get container status \"55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3\": rpc error: code = NotFound desc = could not find container \"55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3\": container with ID starting with 55871590d44af2896f7a834bfbc1c5fc4a985451bc6e368ec21e9d180189f3d3 not found: ID does not exist" Dec 02 16:54:00 crc 
kubenswrapper[4933]: I1202 16:54:00.109049 4933 scope.go:117] "RemoveContainer" containerID="04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738" Dec 02 16:54:00 crc kubenswrapper[4933]: E1202 16:54:00.109328 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738\": container with ID starting with 04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738 not found: ID does not exist" containerID="04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738" Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.109353 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738"} err="failed to get container status \"04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738\": rpc error: code = NotFound desc = could not find container \"04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738\": container with ID starting with 04d320b99a5001c8464afd3f993f9e2086ec0abe3c296527c566ada1cc30a738 not found: ID does not exist" Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.109367 4933 scope.go:117] "RemoveContainer" containerID="b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b" Dec 02 16:54:00 crc kubenswrapper[4933]: E1202 16:54:00.109692 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b\": container with ID starting with b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b not found: ID does not exist" containerID="b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b" Dec 02 16:54:00 crc kubenswrapper[4933]: I1202 16:54:00.109715 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b"} err="failed to get container status \"b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b\": rpc error: code = NotFound desc = could not find container \"b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b\": container with ID starting with b0ee3268c93cba5171ca1d99937a45919b5e7057b6c358b6a748da80f2456c4b not found: ID does not exist" Dec 02 16:54:01 crc kubenswrapper[4933]: I1202 16:54:01.075404 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" path="/var/lib/kubelet/pods/ab1b92a3-0498-4a38-8ea3-e45e7619a111/volumes" Dec 02 16:54:03 crc kubenswrapper[4933]: I1202 16:54:03.060660 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:54:03 crc kubenswrapper[4933]: E1202 16:54:03.061519 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:54:17 crc kubenswrapper[4933]: I1202 16:54:17.063138 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" 
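The entries above show two recurring patterns: the machine-config-daemon container cycling through CrashLoopBackOff back-off retries every ten to fifteen seconds, and "ContainerStatus from runtime service failed ... NotFound" errors raised when the kubelet re-queries containers that CRI-O has already removed during pod cleanup. A minimal Python sketch for summarizing both patterns from journal output follows; it assumes one entry per line (e.g. standard `journalctl -u kubelet` output rather than the wrapped form shown here), and every identifier in it is illustrative, not part of the captured log.

# summarize_kubelet.py (hypothetical name): count CrashLoopBackOff retries
# per pod and NotFound races on container-status lookups in kubelet journal
# output read from stdin. Pure stdlib; a sketch, not a supported tool.
import re
import sys
from collections import Counter

# A kubelet journal entry starts like:
#   "Dec 02 16:54:17 crc kubenswrapper[4933]: E1202 16:54:17.064201 ..."
ENTRY = re.compile(r'^\w{3} \d{2} \d{2}:\d{2}:\d{2} \S+ kubenswrapper\[\d+\]: [IEW]\d+')

backoff = Counter()    # pod -> CrashLoopBackOff "Error syncing pod" count
notfound = Counter()   # 12-char container ID prefix -> NotFound count

for line in sys.stdin:
    if not ENTRY.match(line):
        continue
    if 'CrashLoopBackOff' in line and 'Error syncing pod' in line:
        m = re.search(r'pod="([^"]+)"', line)
        if m:
            backoff[m.group(1)] += 1
    elif 'ContainerStatus from runtime service failed' in line:
        m = re.search(r'containerID="([0-9a-f]{12})', line)
        if m:
            notfound[m.group(1)] += 1

for pod, n in backoff.most_common():
    print(f'{n:3d}x CrashLoopBackOff back-off  pod={pod}')
for cid, n in notfound.most_common():
    print(f'{n:3d}x NotFound on status lookup  container={cid}...')

Run as, e.g., `journalctl -u kubelet | python3 summarize_kubelet.py`. A high retry count for one pod, as with machine-config-daemon-d2p6w here, points to a persistent restart loop (inspect with `kubectl logs --previous` on that pod), whereas the NotFound entries during pod teardown appear to be benign races: the kubelet re-issues RemoveContainer for IDs the runtime has already pruned.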
Dec 02 16:54:17 crc kubenswrapper[4933]: E1202 16:54:17.064201 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:54:31 crc kubenswrapper[4933]: I1202 16:54:31.055030 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:54:31 crc kubenswrapper[4933]: E1202 16:54:31.056150 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:54:42 crc kubenswrapper[4933]: I1202 16:54:42.053921 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:54:42 crc kubenswrapper[4933]: E1202 16:54:42.054876 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:54:54 crc kubenswrapper[4933]: I1202 16:54:54.054494 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:54:54 crc kubenswrapper[4933]: E1202 16:54:54.055392 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:55:08 crc kubenswrapper[4933]: I1202 16:55:08.054970 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:55:08 crc kubenswrapper[4933]: E1202 16:55:08.057557 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:55:22 crc kubenswrapper[4933]: I1202 16:55:22.054345 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:55:22 crc kubenswrapper[4933]: E1202 16:55:22.055846 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.785355 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7wr5g"] Dec 02 16:55:29 crc kubenswrapper[4933]: E1202 16:55:29.787476 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="extract-utilities" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787498 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="extract-utilities" Dec 02 16:55:29 crc kubenswrapper[4933]: E1202 16:55:29.787543 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="extract-content" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787552 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="extract-content" Dec 02 16:55:29 crc kubenswrapper[4933]: E1202 16:55:29.787574 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="registry-server" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787582 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="registry-server" Dec 02 16:55:29 crc kubenswrapper[4933]: E1202 16:55:29.787594 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="extract-utilities" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787601 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="extract-utilities" Dec 02 16:55:29 crc kubenswrapper[4933]: E1202 16:55:29.787629 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="extract-content" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787639 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="extract-content" Dec 02 16:55:29 crc kubenswrapper[4933]: E1202 16:55:29.787651 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="registry-server" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787658 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="registry-server" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787959 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="00804f87-39e2-409a-bb48-6e4e1574bacd" containerName="registry-server" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.787981 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1b92a3-0498-4a38-8ea3-e45e7619a111" containerName="registry-server" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.790224 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.810950 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7wr5g"] Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.815456 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-catalog-content\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.815512 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvz7s\" (UniqueName: \"kubernetes.io/projected/06d5a596-f0d1-4758-8822-d46df6bb9fab-kube-api-access-xvz7s\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.815652 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-utilities\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.918526 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-catalog-content\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.918618 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvz7s\" (UniqueName: \"kubernetes.io/projected/06d5a596-f0d1-4758-8822-d46df6bb9fab-kube-api-access-xvz7s\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.918939 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-utilities\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.918957 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-catalog-content\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.919499 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-utilities\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:29 crc kubenswrapper[4933]: I1202 16:55:29.941412 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xvz7s\" (UniqueName: \"kubernetes.io/projected/06d5a596-f0d1-4758-8822-d46df6bb9fab-kube-api-access-xvz7s\") pod \"community-operators-7wr5g\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:30 crc kubenswrapper[4933]: I1202 16:55:30.123429 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:30 crc kubenswrapper[4933]: I1202 16:55:30.759614 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7wr5g"] Dec 02 16:55:31 crc kubenswrapper[4933]: I1202 16:55:31.176560 4933 generic.go:334] "Generic (PLEG): container finished" podID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerID="a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3" exitCode=0 Dec 02 16:55:31 crc kubenswrapper[4933]: I1202 16:55:31.176680 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wr5g" event={"ID":"06d5a596-f0d1-4758-8822-d46df6bb9fab","Type":"ContainerDied","Data":"a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3"} Dec 02 16:55:31 crc kubenswrapper[4933]: I1202 16:55:31.176921 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wr5g" event={"ID":"06d5a596-f0d1-4758-8822-d46df6bb9fab","Type":"ContainerStarted","Data":"8bf46f181977cc33e8910e61fce43e4034a6c6c0e973964c6b228f7395a98bca"} Dec 02 16:55:31 crc kubenswrapper[4933]: I1202 16:55:31.178720 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 16:55:33 crc kubenswrapper[4933]: I1202 16:55:33.199958 4933 generic.go:334] "Generic (PLEG): container finished" podID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerID="b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44" exitCode=0 Dec 02 16:55:33 crc kubenswrapper[4933]: I1202 16:55:33.200059 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wr5g" event={"ID":"06d5a596-f0d1-4758-8822-d46df6bb9fab","Type":"ContainerDied","Data":"b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44"} Dec 02 16:55:34 crc kubenswrapper[4933]: I1202 16:55:34.211523 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wr5g" event={"ID":"06d5a596-f0d1-4758-8822-d46df6bb9fab","Type":"ContainerStarted","Data":"99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460"} Dec 02 16:55:34 crc kubenswrapper[4933]: I1202 16:55:34.233793 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7wr5g" podStartSLOduration=2.674603002 podStartE2EDuration="5.233771298s" podCreationTimestamp="2025-12-02 16:55:29 +0000 UTC" firstStartedPulling="2025-12-02 16:55:31.178508401 +0000 UTC m=+3794.429735104" lastFinishedPulling="2025-12-02 16:55:33.737676697 +0000 UTC m=+3796.988903400" observedRunningTime="2025-12-02 16:55:34.228268671 +0000 UTC m=+3797.479495384" watchObservedRunningTime="2025-12-02 16:55:34.233771298 +0000 UTC m=+3797.484998001" Dec 02 16:55:35 crc kubenswrapper[4933]: I1202 16:55:35.053919 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:55:35 crc kubenswrapper[4933]: E1202 16:55:35.054278 4933 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:55:40 crc kubenswrapper[4933]: I1202 16:55:40.124689 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:40 crc kubenswrapper[4933]: I1202 16:55:40.125736 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:40 crc kubenswrapper[4933]: I1202 16:55:40.174569 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:40 crc kubenswrapper[4933]: I1202 16:55:40.326335 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:40 crc kubenswrapper[4933]: I1202 16:55:40.413588 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7wr5g"] Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.298076 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7wr5g" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="registry-server" containerID="cri-o://99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460" gracePeriod=2 Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.868743 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.951215 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-catalog-content\") pod \"06d5a596-f0d1-4758-8822-d46df6bb9fab\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.951358 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvz7s\" (UniqueName: \"kubernetes.io/projected/06d5a596-f0d1-4758-8822-d46df6bb9fab-kube-api-access-xvz7s\") pod \"06d5a596-f0d1-4758-8822-d46df6bb9fab\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.951402 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-utilities\") pod \"06d5a596-f0d1-4758-8822-d46df6bb9fab\" (UID: \"06d5a596-f0d1-4758-8822-d46df6bb9fab\") " Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.952383 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-utilities" (OuterVolumeSpecName: "utilities") pod "06d5a596-f0d1-4758-8822-d46df6bb9fab" (UID: "06d5a596-f0d1-4758-8822-d46df6bb9fab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:55:42 crc kubenswrapper[4933]: I1202 16:55:42.958023 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d5a596-f0d1-4758-8822-d46df6bb9fab-kube-api-access-xvz7s" (OuterVolumeSpecName: "kube-api-access-xvz7s") pod "06d5a596-f0d1-4758-8822-d46df6bb9fab" (UID: "06d5a596-f0d1-4758-8822-d46df6bb9fab"). InnerVolumeSpecName "kube-api-access-xvz7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.009659 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06d5a596-f0d1-4758-8822-d46df6bb9fab" (UID: "06d5a596-f0d1-4758-8822-d46df6bb9fab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.054851 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.054884 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvz7s\" (UniqueName: \"kubernetes.io/projected/06d5a596-f0d1-4758-8822-d46df6bb9fab-kube-api-access-xvz7s\") on node \"crc\" DevicePath \"\"" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.054899 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d5a596-f0d1-4758-8822-d46df6bb9fab-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.309362 4933 generic.go:334] "Generic (PLEG): container finished" podID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerID="99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460" exitCode=0 Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.309400 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wr5g" event={"ID":"06d5a596-f0d1-4758-8822-d46df6bb9fab","Type":"ContainerDied","Data":"99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460"} Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.309427 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wr5g" event={"ID":"06d5a596-f0d1-4758-8822-d46df6bb9fab","Type":"ContainerDied","Data":"8bf46f181977cc33e8910e61fce43e4034a6c6c0e973964c6b228f7395a98bca"} Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.309425 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7wr5g" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.309443 4933 scope.go:117] "RemoveContainer" containerID="99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.339675 4933 scope.go:117] "RemoveContainer" containerID="b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.341862 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7wr5g"] Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.352205 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7wr5g"] Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.363979 4933 scope.go:117] "RemoveContainer" containerID="a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.411646 4933 scope.go:117] "RemoveContainer" containerID="99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460" Dec 02 16:55:43 crc kubenswrapper[4933]: E1202 16:55:43.412176 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460\": container with ID starting with 99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460 not found: ID does not exist" containerID="99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.412226 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460"} err="failed to get container status \"99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460\": rpc error: code = NotFound desc = could not find container \"99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460\": container with ID starting with 99d15c66e2f14aa69863952857fed562229aed851c71df0f559bdda226651460 not found: ID does not exist" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.412258 4933 scope.go:117] "RemoveContainer" containerID="b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44" Dec 02 16:55:43 crc kubenswrapper[4933]: E1202 16:55:43.412699 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44\": container with ID starting with b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44 not found: ID does not exist" containerID="b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.412731 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44"} err="failed to get container status \"b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44\": rpc error: code = NotFound desc = could not find container \"b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44\": container with ID starting with b35c242cf1fe6994c772d480cb44df7cef649809a8d778df52c80b25852a2d44 not found: ID does not exist" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.412755 4933 scope.go:117] "RemoveContainer" 
containerID="a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3" Dec 02 16:55:43 crc kubenswrapper[4933]: E1202 16:55:43.413058 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3\": container with ID starting with a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3 not found: ID does not exist" containerID="a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3" Dec 02 16:55:43 crc kubenswrapper[4933]: I1202 16:55:43.413093 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3"} err="failed to get container status \"a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3\": rpc error: code = NotFound desc = could not find container \"a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3\": container with ID starting with a2b92abd018a6dc90fa73ba5c65ef133d2bb233f828c84ac2720fcfc7749c2c3 not found: ID does not exist" Dec 02 16:55:45 crc kubenswrapper[4933]: I1202 16:55:45.069476 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" path="/var/lib/kubelet/pods/06d5a596-f0d1-4758-8822-d46df6bb9fab/volumes" Dec 02 16:55:50 crc kubenswrapper[4933]: I1202 16:55:50.054317 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:55:50 crc kubenswrapper[4933]: E1202 16:55:50.055148 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:56:04 crc kubenswrapper[4933]: I1202 16:56:04.058410 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:56:04 crc kubenswrapper[4933]: E1202 16:56:04.059276 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:56:17 crc kubenswrapper[4933]: I1202 16:56:17.061804 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:56:17 crc kubenswrapper[4933]: E1202 16:56:17.062807 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:56:31 crc kubenswrapper[4933]: I1202 16:56:31.054349 4933 scope.go:117] "RemoveContainer" 
containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:56:31 crc kubenswrapper[4933]: E1202 16:56:31.055175 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:56:44 crc kubenswrapper[4933]: I1202 16:56:44.054173 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:56:44 crc kubenswrapper[4933]: E1202 16:56:44.055282 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:56:58 crc kubenswrapper[4933]: I1202 16:56:58.053791 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:56:58 crc kubenswrapper[4933]: E1202 16:56:58.055632 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:57:13 crc kubenswrapper[4933]: I1202 16:57:13.054753 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:57:13 crc kubenswrapper[4933]: E1202 16:57:13.056198 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:57:25 crc kubenswrapper[4933]: I1202 16:57:25.053189 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:57:25 crc kubenswrapper[4933]: E1202 16:57:25.054156 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:57:36 crc kubenswrapper[4933]: I1202 16:57:36.055133 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:57:36 crc kubenswrapper[4933]: E1202 16:57:36.056052 4933 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:57:47 crc kubenswrapper[4933]: I1202 16:57:47.072144 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:57:47 crc kubenswrapper[4933]: E1202 16:57:47.073141 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:57:58 crc kubenswrapper[4933]: I1202 16:57:58.053711 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:57:58 crc kubenswrapper[4933]: E1202 16:57:58.054455 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:58:10 crc kubenswrapper[4933]: I1202 16:58:10.053938 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:58:10 crc kubenswrapper[4933]: E1202 16:58:10.054839 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:58:21 crc kubenswrapper[4933]: I1202 16:58:21.059187 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:58:21 crc kubenswrapper[4933]: E1202 16:58:21.059971 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:58:33 crc kubenswrapper[4933]: I1202 16:58:33.054810 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:58:33 crc kubenswrapper[4933]: E1202 16:58:33.055588 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:58:47 crc kubenswrapper[4933]: I1202 16:58:47.063746 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:58:47 crc kubenswrapper[4933]: E1202 16:58:47.064663 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 16:59:00 crc kubenswrapper[4933]: I1202 16:59:00.054105 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065" Dec 02 16:59:01 crc kubenswrapper[4933]: I1202 16:59:01.742773 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"51d509c3d4a4f36dfcf653f9a16a1e6d29bd39eaec00991530f2e151f65f7108"} Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.192503 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn"] Dec 02 17:00:00 crc kubenswrapper[4933]: E1202 17:00:00.193515 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="registry-server" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.193532 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="registry-server" Dec 02 17:00:00 crc kubenswrapper[4933]: E1202 17:00:00.193563 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="extract-content" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.193571 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="extract-content" Dec 02 17:00:00 crc kubenswrapper[4933]: E1202 17:00:00.193614 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="extract-utilities" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.193622 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="extract-utilities" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.193902 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5a596-f0d1-4758-8822-d46df6bb9fab" containerName="registry-server" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.194864 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.196934 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.197244 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.206292 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn"] Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.310735 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2258a4f-47f3-4a6d-8715-b71be971f023-secret-volume\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.310893 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frgq5\" (UniqueName: \"kubernetes.io/projected/d2258a4f-47f3-4a6d-8715-b71be971f023-kube-api-access-frgq5\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.311327 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2258a4f-47f3-4a6d-8715-b71be971f023-config-volume\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.413679 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frgq5\" (UniqueName: \"kubernetes.io/projected/d2258a4f-47f3-4a6d-8715-b71be971f023-kube-api-access-frgq5\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.413895 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2258a4f-47f3-4a6d-8715-b71be971f023-config-volume\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.413952 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2258a4f-47f3-4a6d-8715-b71be971f023-secret-volume\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.415290 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2258a4f-47f3-4a6d-8715-b71be971f023-config-volume\") pod 
\"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.425451 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2258a4f-47f3-4a6d-8715-b71be971f023-secret-volume\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.431174 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frgq5\" (UniqueName: \"kubernetes.io/projected/d2258a4f-47f3-4a6d-8715-b71be971f023-kube-api-access-frgq5\") pod \"collect-profiles-29411580-mvjxn\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:00 crc kubenswrapper[4933]: I1202 17:00:00.525305 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:01 crc kubenswrapper[4933]: I1202 17:00:01.007706 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn"] Dec 02 17:00:01 crc kubenswrapper[4933]: W1202 17:00:01.017027 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2258a4f_47f3_4a6d_8715_b71be971f023.slice/crio-135cb7e340c3a0f2b1a18d30bdc65132aeab8ea262229dbd86dec02ab90bcb75 WatchSource:0}: Error finding container 135cb7e340c3a0f2b1a18d30bdc65132aeab8ea262229dbd86dec02ab90bcb75: Status 404 returned error can't find the container with id 135cb7e340c3a0f2b1a18d30bdc65132aeab8ea262229dbd86dec02ab90bcb75 Dec 02 17:00:01 crc kubenswrapper[4933]: I1202 17:00:01.499556 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" event={"ID":"d2258a4f-47f3-4a6d-8715-b71be971f023","Type":"ContainerStarted","Data":"20da9cd0b4fb979e5d2d7914bb0487b52773ee1f4ce1ca057fbd710421ad4760"} Dec 02 17:00:01 crc kubenswrapper[4933]: I1202 17:00:01.499886 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" event={"ID":"d2258a4f-47f3-4a6d-8715-b71be971f023","Type":"ContainerStarted","Data":"135cb7e340c3a0f2b1a18d30bdc65132aeab8ea262229dbd86dec02ab90bcb75"} Dec 02 17:00:02 crc kubenswrapper[4933]: I1202 17:00:02.517858 4933 generic.go:334] "Generic (PLEG): container finished" podID="d2258a4f-47f3-4a6d-8715-b71be971f023" containerID="20da9cd0b4fb979e5d2d7914bb0487b52773ee1f4ce1ca057fbd710421ad4760" exitCode=0 Dec 02 17:00:02 crc kubenswrapper[4933]: I1202 17:00:02.518253 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" event={"ID":"d2258a4f-47f3-4a6d-8715-b71be971f023","Type":"ContainerDied","Data":"20da9cd0b4fb979e5d2d7914bb0487b52773ee1f4ce1ca057fbd710421ad4760"} Dec 02 17:00:02 crc kubenswrapper[4933]: I1202 17:00:02.982736 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.119178 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2258a4f-47f3-4a6d-8715-b71be971f023-secret-volume\") pod \"d2258a4f-47f3-4a6d-8715-b71be971f023\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.119694 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frgq5\" (UniqueName: \"kubernetes.io/projected/d2258a4f-47f3-4a6d-8715-b71be971f023-kube-api-access-frgq5\") pod \"d2258a4f-47f3-4a6d-8715-b71be971f023\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.119941 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2258a4f-47f3-4a6d-8715-b71be971f023-config-volume\") pod \"d2258a4f-47f3-4a6d-8715-b71be971f023\" (UID: \"d2258a4f-47f3-4a6d-8715-b71be971f023\") " Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.121743 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2258a4f-47f3-4a6d-8715-b71be971f023-config-volume" (OuterVolumeSpecName: "config-volume") pod "d2258a4f-47f3-4a6d-8715-b71be971f023" (UID: "d2258a4f-47f3-4a6d-8715-b71be971f023"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.125708 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2258a4f-47f3-4a6d-8715-b71be971f023-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d2258a4f-47f3-4a6d-8715-b71be971f023" (UID: "d2258a4f-47f3-4a6d-8715-b71be971f023"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.128618 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2258a4f-47f3-4a6d-8715-b71be971f023-kube-api-access-frgq5" (OuterVolumeSpecName: "kube-api-access-frgq5") pod "d2258a4f-47f3-4a6d-8715-b71be971f023" (UID: "d2258a4f-47f3-4a6d-8715-b71be971f023"). InnerVolumeSpecName "kube-api-access-frgq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.226120 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2258a4f-47f3-4a6d-8715-b71be971f023-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.226199 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2258a4f-47f3-4a6d-8715-b71be971f023-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.226214 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frgq5\" (UniqueName: \"kubernetes.io/projected/d2258a4f-47f3-4a6d-8715-b71be971f023-kube-api-access-frgq5\") on node \"crc\" DevicePath \"\"" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.533412 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" event={"ID":"d2258a4f-47f3-4a6d-8715-b71be971f023","Type":"ContainerDied","Data":"135cb7e340c3a0f2b1a18d30bdc65132aeab8ea262229dbd86dec02ab90bcb75"} Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.533448 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135cb7e340c3a0f2b1a18d30bdc65132aeab8ea262229dbd86dec02ab90bcb75" Dec 02 17:00:03 crc kubenswrapper[4933]: I1202 17:00:03.533458 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn" Dec 02 17:00:04 crc kubenswrapper[4933]: I1202 17:00:04.072077 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv"] Dec 02 17:00:04 crc kubenswrapper[4933]: I1202 17:00:04.085853 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411535-sfhvv"] Dec 02 17:00:05 crc kubenswrapper[4933]: I1202 17:00:05.066901 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249f9151-f57b-4ab8-8e3a-5c5256e2ed1c" path="/var/lib/kubelet/pods/249f9151-f57b-4ab8-8e3a-5c5256e2ed1c/volumes" Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.421519 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vkpv9"] Dec 02 17:00:34 crc kubenswrapper[4933]: E1202 17:00:34.429673 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2258a4f-47f3-4a6d-8715-b71be971f023" containerName="collect-profiles" Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.429713 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2258a4f-47f3-4a6d-8715-b71be971f023" containerName="collect-profiles" Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.430543 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2258a4f-47f3-4a6d-8715-b71be971f023" containerName="collect-profiles" Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.434221 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.434221 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.481440 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkpv9"]
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.589579 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-catalog-content\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.589700 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-utilities\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.589723 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7d7\" (UniqueName: \"kubernetes.io/projected/2ed26d12-966e-4798-a554-7eebd50431e7-kube-api-access-hf7d7\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.691944 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-catalog-content\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.692143 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-utilities\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.692173 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7d7\" (UniqueName: \"kubernetes.io/projected/2ed26d12-966e-4798-a554-7eebd50431e7-kube-api-access-hf7d7\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.692519 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-catalog-content\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.692696 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-utilities\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.711400 4933 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-hf7d7\" (UniqueName: \"kubernetes.io/projected/2ed26d12-966e-4798-a554-7eebd50431e7-kube-api-access-hf7d7\") pod \"redhat-marketplace-vkpv9\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") " pod="openshift-marketplace/redhat-marketplace-vkpv9" Dec 02 17:00:34 crc kubenswrapper[4933]: I1202 17:00:34.796200 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkpv9" Dec 02 17:00:35 crc kubenswrapper[4933]: I1202 17:00:35.335953 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkpv9"] Dec 02 17:00:35 crc kubenswrapper[4933]: I1202 17:00:35.992096 4933 generic.go:334] "Generic (PLEG): container finished" podID="2ed26d12-966e-4798-a554-7eebd50431e7" containerID="b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356" exitCode=0 Dec 02 17:00:35 crc kubenswrapper[4933]: I1202 17:00:35.992152 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerDied","Data":"b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356"} Dec 02 17:00:35 crc kubenswrapper[4933]: I1202 17:00:35.993132 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerStarted","Data":"e11b98b27da19518be3b1a122c3b87d4204a0c4f428784ca9e3ee900e3a0a5ae"} Dec 02 17:00:35 crc kubenswrapper[4933]: I1202 17:00:35.994643 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:00:37 crc kubenswrapper[4933]: I1202 17:00:37.008307 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerStarted","Data":"a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2"} Dec 02 17:00:38 crc kubenswrapper[4933]: I1202 17:00:38.025316 4933 generic.go:334] "Generic (PLEG): container finished" podID="2ed26d12-966e-4798-a554-7eebd50431e7" containerID="a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2" exitCode=0 Dec 02 17:00:38 crc kubenswrapper[4933]: I1202 17:00:38.025406 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerDied","Data":"a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2"} Dec 02 17:00:39 crc kubenswrapper[4933]: I1202 17:00:39.041991 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerStarted","Data":"11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0"} Dec 02 17:00:39 crc kubenswrapper[4933]: I1202 17:00:39.069548 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vkpv9" podStartSLOduration=2.5746149000000003 podStartE2EDuration="5.069525322s" podCreationTimestamp="2025-12-02 17:00:34 +0000 UTC" firstStartedPulling="2025-12-02 17:00:35.994420699 +0000 UTC m=+4099.245647402" lastFinishedPulling="2025-12-02 17:00:38.489331121 +0000 UTC m=+4101.740557824" observedRunningTime="2025-12-02 17:00:39.059152915 +0000 UTC m=+4102.310379628" watchObservedRunningTime="2025-12-02 17:00:39.069525322 +0000 UTC 
m=+4102.320752025"
Dec 02 17:00:44 crc kubenswrapper[4933]: I1202 17:00:44.801137 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:44 crc kubenswrapper[4933]: I1202 17:00:44.801682 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:44 crc kubenswrapper[4933]: I1202 17:00:44.864630 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:45 crc kubenswrapper[4933]: I1202 17:00:45.149489 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:45 crc kubenswrapper[4933]: I1202 17:00:45.216629 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkpv9"]
Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.140622 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vkpv9" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="registry-server" containerID="cri-o://11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0" gracePeriod=2
Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.311678 4933 scope.go:117] "RemoveContainer" containerID="e3e507e10cf4ad3f6184e645ef67d092ac7ef28eb186805dbe88dbf2a8d0c8e6"
Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.705063 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkpv9"
Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.836059 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-utilities\") pod \"2ed26d12-966e-4798-a554-7eebd50431e7\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") "
Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.836499 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-catalog-content\") pod \"2ed26d12-966e-4798-a554-7eebd50431e7\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") "
Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.836523 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf7d7\" (UniqueName: \"kubernetes.io/projected/2ed26d12-966e-4798-a554-7eebd50431e7-kube-api-access-hf7d7\") pod \"2ed26d12-966e-4798-a554-7eebd50431e7\" (UID: \"2ed26d12-966e-4798-a554-7eebd50431e7\") "
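The pod_startup_latency_tracker entry above reports two durations for redhat-marketplace-vkpv9, and they appear to be related by podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling), i.e. the SLO figure excludes the image-pull window, while the E2E figure is watchObservedRunningTime minus podCreationTimestamp. The sketch below is my own arithmetic check, not kubelet code; it re-derives both numbers from the quoted timestamps.

// latencycheck: re-derive the startup-latency figures from the log line above.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-02 17:00:34 +0000 UTC")
	firstPull := mustParse("2025-12-02 17:00:35.994420699 +0000 UTC")
	lastPull := mustParse("2025-12-02 17:00:38.489331121 +0000 UTC")
	watchObserved := mustParse("2025-12-02 17:00:39.069525322 +0000 UTC")

	e2e := watchObserved.Sub(created) // 5.069525322s, matching podStartE2EDuration
	pull := lastPull.Sub(firstPull)   // 2.494910422s spent pulling the image
	// SLO duration excludes the pull window: 2.5746149s, matching podStartSLOduration.
	fmt.Println("E2E:", e2e, "pull:", pull, "SLO:", e2e-pull)
}

The certified-operators-5jc6r line further down obeys the same identity (11.089s end to end, of which 8.689s was image pull, leaving 2.399s of SLO-relevant startup).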
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.838925 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.842198 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed26d12-966e-4798-a554-7eebd50431e7-kube-api-access-hf7d7" (OuterVolumeSpecName: "kube-api-access-hf7d7") pod "2ed26d12-966e-4798-a554-7eebd50431e7" (UID: "2ed26d12-966e-4798-a554-7eebd50431e7"). InnerVolumeSpecName "kube-api-access-hf7d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.869433 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ed26d12-966e-4798-a554-7eebd50431e7" (UID: "2ed26d12-966e-4798-a554-7eebd50431e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.941748 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed26d12-966e-4798-a554-7eebd50431e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:00:47 crc kubenswrapper[4933]: I1202 17:00:47.941796 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf7d7\" (UniqueName: \"kubernetes.io/projected/2ed26d12-966e-4798-a554-7eebd50431e7-kube-api-access-hf7d7\") on node \"crc\" DevicePath \"\"" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.152669 4933 generic.go:334] "Generic (PLEG): container finished" podID="2ed26d12-966e-4798-a554-7eebd50431e7" containerID="11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0" exitCode=0 Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.152708 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerDied","Data":"11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0"} Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.152734 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkpv9" event={"ID":"2ed26d12-966e-4798-a554-7eebd50431e7","Type":"ContainerDied","Data":"e11b98b27da19518be3b1a122c3b87d4204a0c4f428784ca9e3ee900e3a0a5ae"} Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.152750 4933 scope.go:117] "RemoveContainer" containerID="11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.152907 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkpv9" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.191025 4933 scope.go:117] "RemoveContainer" containerID="a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.202707 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkpv9"] Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.215862 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkpv9"] Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.216389 4933 scope.go:117] "RemoveContainer" containerID="b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.237064 4933 scope.go:117] "RemoveContainer" containerID="11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0" Dec 02 17:00:48 crc kubenswrapper[4933]: E1202 17:00:48.243295 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0\": container with ID starting with 11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0 not found: ID does not exist" containerID="11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.243332 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0"} err="failed to get container status \"11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0\": rpc error: code = NotFound desc = could not find container \"11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0\": container with ID starting with 11405e85ec691a66897bd1f51f04676aea9de0a988adc9134f3f97e3375a7fb0 not found: ID does not exist" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.243356 4933 scope.go:117] "RemoveContainer" containerID="a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2" Dec 02 17:00:48 crc kubenswrapper[4933]: E1202 17:00:48.243701 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2\": container with ID starting with a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2 not found: ID does not exist" containerID="a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.243731 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2"} err="failed to get container status \"a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2\": rpc error: code = NotFound desc = could not find container \"a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2\": container with ID starting with a759dba77e2363c77764f9b33046797722f2625a05eefe52e011181140c56ae2 not found: ID does not exist" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.243744 4933 scope.go:117] "RemoveContainer" containerID="b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356" Dec 02 17:00:48 crc kubenswrapper[4933]: E1202 17:00:48.244065 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356\": container with ID starting with b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356 not found: ID does not exist" containerID="b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356" Dec 02 17:00:48 crc kubenswrapper[4933]: I1202 17:00:48.244088 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356"} err="failed to get container status \"b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356\": rpc error: code = NotFound desc = could not find container \"b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356\": container with ID starting with b424b545fc4220fc57b7218ba18ca35ccc9a317cc34f224bb4c016880417a356 not found: ID does not exist" Dec 02 17:00:48 crc kubenswrapper[4933]: E1202 17:00:48.289446 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed26d12_966e_4798_a554_7eebd50431e7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed26d12_966e_4798_a554_7eebd50431e7.slice/crio-e11b98b27da19518be3b1a122c3b87d4204a0c4f428784ca9e3ee900e3a0a5ae\": RecentStats: unable to find data in memory cache]" Dec 02 17:00:48 crc kubenswrapper[4933]: E1202 17:00:48.289752 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed26d12_966e_4798_a554_7eebd50431e7.slice\": RecentStats: unable to find data in memory cache]" Dec 02 17:00:49 crc kubenswrapper[4933]: I1202 17:00:49.068267 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" path="/var/lib/kubelet/pods/2ed26d12-966e-4798-a554-7eebd50431e7/volumes" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.150479 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411581-d66dr"] Dec 02 17:01:00 crc kubenswrapper[4933]: E1202 17:01:00.151481 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="extract-utilities" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.151499 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="extract-utilities" Dec 02 17:01:00 crc kubenswrapper[4933]: E1202 17:01:00.151537 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="registry-server" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.151543 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="registry-server" Dec 02 17:01:00 crc kubenswrapper[4933]: E1202 17:01:00.151575 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="extract-content" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.151581 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="extract-content" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.151803 4933 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2ed26d12-966e-4798-a554-7eebd50431e7" containerName="registry-server" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.152686 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.192083 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411581-d66dr"] Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.260746 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjx4\" (UniqueName: \"kubernetes.io/projected/86c26f4d-35bf-4616-8935-e00dad2fb46a-kube-api-access-xtjx4\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.260880 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-combined-ca-bundle\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.260935 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-config-data\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.260960 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-fernet-keys\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.364003 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjx4\" (UniqueName: \"kubernetes.io/projected/86c26f4d-35bf-4616-8935-e00dad2fb46a-kube-api-access-xtjx4\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.364141 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-combined-ca-bundle\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.364198 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-config-data\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.364228 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-fernet-keys\") pod \"keystone-cron-29411581-d66dr\" 
(UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.883505 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-config-data\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.885557 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjx4\" (UniqueName: \"kubernetes.io/projected/86c26f4d-35bf-4616-8935-e00dad2fb46a-kube-api-access-xtjx4\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.885974 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-combined-ca-bundle\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:00 crc kubenswrapper[4933]: I1202 17:01:00.894641 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-fernet-keys\") pod \"keystone-cron-29411581-d66dr\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:01 crc kubenswrapper[4933]: I1202 17:01:01.077205 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:01 crc kubenswrapper[4933]: I1202 17:01:01.655063 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411581-d66dr"] Dec 02 17:01:02 crc kubenswrapper[4933]: I1202 17:01:02.372304 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411581-d66dr" event={"ID":"86c26f4d-35bf-4616-8935-e00dad2fb46a","Type":"ContainerStarted","Data":"5c4aff4e449cb629aeea6fa6e41dddd372b492be25744443b9656f998cc0f971"} Dec 02 17:01:02 crc kubenswrapper[4933]: I1202 17:01:02.372640 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411581-d66dr" event={"ID":"86c26f4d-35bf-4616-8935-e00dad2fb46a","Type":"ContainerStarted","Data":"4b37187aba5c5be8b7e141d99b0e18f726eec26fca429450bf099d58be989b3c"} Dec 02 17:01:02 crc kubenswrapper[4933]: I1202 17:01:02.421507 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411581-d66dr" podStartSLOduration=2.4214723510000002 podStartE2EDuration="2.421472351s" podCreationTimestamp="2025-12-02 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 17:01:02.391215413 +0000 UTC m=+4125.642442136" watchObservedRunningTime="2025-12-02 17:01:02.421472351 +0000 UTC m=+4125.672699084" Dec 02 17:01:04 crc kubenswrapper[4933]: I1202 17:01:04.395227 4933 generic.go:334] "Generic (PLEG): container finished" podID="86c26f4d-35bf-4616-8935-e00dad2fb46a" containerID="5c4aff4e449cb629aeea6fa6e41dddd372b492be25744443b9656f998cc0f971" exitCode=0 Dec 02 17:01:04 crc kubenswrapper[4933]: I1202 17:01:04.395339 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29411581-d66dr" event={"ID":"86c26f4d-35bf-4616-8935-e00dad2fb46a","Type":"ContainerDied","Data":"5c4aff4e449cb629aeea6fa6e41dddd372b492be25744443b9656f998cc0f971"} Dec 02 17:01:05 crc kubenswrapper[4933]: I1202 17:01:05.988578 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.137920 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtjx4\" (UniqueName: \"kubernetes.io/projected/86c26f4d-35bf-4616-8935-e00dad2fb46a-kube-api-access-xtjx4\") pod \"86c26f4d-35bf-4616-8935-e00dad2fb46a\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.138090 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-fernet-keys\") pod \"86c26f4d-35bf-4616-8935-e00dad2fb46a\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.138150 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-combined-ca-bundle\") pod \"86c26f4d-35bf-4616-8935-e00dad2fb46a\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.138617 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-config-data\") pod \"86c26f4d-35bf-4616-8935-e00dad2fb46a\" (UID: \"86c26f4d-35bf-4616-8935-e00dad2fb46a\") " Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.144067 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "86c26f4d-35bf-4616-8935-e00dad2fb46a" (UID: "86c26f4d-35bf-4616-8935-e00dad2fb46a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.149297 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c26f4d-35bf-4616-8935-e00dad2fb46a-kube-api-access-xtjx4" (OuterVolumeSpecName: "kube-api-access-xtjx4") pod "86c26f4d-35bf-4616-8935-e00dad2fb46a" (UID: "86c26f4d-35bf-4616-8935-e00dad2fb46a"). InnerVolumeSpecName "kube-api-access-xtjx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.175436 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86c26f4d-35bf-4616-8935-e00dad2fb46a" (UID: "86c26f4d-35bf-4616-8935-e00dad2fb46a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.224518 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-config-data" (OuterVolumeSpecName: "config-data") pod "86c26f4d-35bf-4616-8935-e00dad2fb46a" (UID: "86c26f4d-35bf-4616-8935-e00dad2fb46a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.242824 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.242878 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtjx4\" (UniqueName: \"kubernetes.io/projected/86c26f4d-35bf-4616-8935-e00dad2fb46a-kube-api-access-xtjx4\") on node \"crc\" DevicePath \"\"" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.242890 4933 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.242899 4933 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c26f4d-35bf-4616-8935-e00dad2fb46a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.447911 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411581-d66dr" event={"ID":"86c26f4d-35bf-4616-8935-e00dad2fb46a","Type":"ContainerDied","Data":"4b37187aba5c5be8b7e141d99b0e18f726eec26fca429450bf099d58be989b3c"} Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.448268 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b37187aba5c5be8b7e141d99b0e18f726eec26fca429450bf099d58be989b3c" Dec 02 17:01:06 crc kubenswrapper[4933]: I1202 17:01:06.448264 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411581-d66dr" Dec 02 17:01:17 crc kubenswrapper[4933]: I1202 17:01:17.169405 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:01:17 crc kubenswrapper[4933]: I1202 17:01:17.171002 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:01:47 crc kubenswrapper[4933]: I1202 17:01:47.169653 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:01:47 crc kubenswrapper[4933]: I1202 17:01:47.170377 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.169935 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.170602 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.170664 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.171943 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51d509c3d4a4f36dfcf653f9a16a1e6d29bd39eaec00991530f2e151f65f7108"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.172018 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://51d509c3d4a4f36dfcf653f9a16a1e6d29bd39eaec00991530f2e151f65f7108" gracePeriod=600
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.324070 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="51d509c3d4a4f36dfcf653f9a16a1e6d29bd39eaec00991530f2e151f65f7108" exitCode=0
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.324125 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"51d509c3d4a4f36dfcf653f9a16a1e6d29bd39eaec00991530f2e151f65f7108"}
Dec 02 17:02:17 crc kubenswrapper[4933]: I1202 17:02:17.324196 4933 scope.go:117] "RemoveContainer" containerID="0536b3f7f2f953996f9cc98878934d57a628ba193ea15fc456e5b4e41cade065"
Dec 02 17:02:18 crc kubenswrapper[4933]: I1202 17:02:18.339021 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"}
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.663769 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jc6r"]
Dec 02 17:04:28 crc kubenswrapper[4933]: E1202 17:04:28.665153 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c26f4d-35bf-4616-8935-e00dad2fb46a" containerName="keystone-cron"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.665172 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c26f4d-35bf-4616-8935-e00dad2fb46a" containerName="keystone-cron"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.665634 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c26f4d-35bf-4616-8935-e00dad2fb46a" containerName="keystone-cron"
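The repeating machine-config-daemon liveness failures above are plain HTTP GETs against http://127.0.0.1:8798/health answering "connection refused", meaning nothing was listening on that port; once the failure threshold is crossed, the kubelet kills the container with gracePeriod=600 and restarts it. As a stand-in (this is not machine-config-daemon source; only the port and path come from the log), a process that satisfies such a probe needs nothing more than:

// healthstub: the minimal HTTP endpoint shape an HTTP liveness probe expects.
package main

import (
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK) // any 2xx/3xx counts as probe success
	})
	// Port and path taken from the probe output in the log above.
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", mux))
}

The earlier startup-probe flips on the marketplace pods ("unhealthy" then "started" within a second) are the same mechanism converging once the registry server finally binds its port.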
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.668234 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.675638 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jc6r"]
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.715136 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-catalog-content\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.715232 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-utilities\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.715349 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfk2j\" (UniqueName: \"kubernetes.io/projected/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-kube-api-access-wfk2j\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.820911 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-catalog-content\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.821095 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-utilities\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.821305 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfk2j\" (UniqueName: \"kubernetes.io/projected/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-kube-api-access-wfk2j\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.824759 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-catalog-content\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.829076 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-utilities\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r"
Dec 02 17:04:28 crc kubenswrapper[4933]: I1202 17:04:28.850843 4933
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfk2j\" (UniqueName: \"kubernetes.io/projected/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-kube-api-access-wfk2j\") pod \"certified-operators-5jc6r\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:29 crc kubenswrapper[4933]: I1202 17:04:29.002132 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:29 crc kubenswrapper[4933]: I1202 17:04:29.490675 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jc6r"] Dec 02 17:04:29 crc kubenswrapper[4933]: I1202 17:04:29.936619 4933 generic.go:334] "Generic (PLEG): container finished" podID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerID="c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204" exitCode=0 Dec 02 17:04:29 crc kubenswrapper[4933]: I1202 17:04:29.936688 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerDied","Data":"c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204"} Dec 02 17:04:29 crc kubenswrapper[4933]: I1202 17:04:29.937150 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerStarted","Data":"403d9225ab7a16d3649e5bd17f8de7e9d2c8499a5bb538a0bff65eb2bd87e4fe"} Dec 02 17:04:37 crc kubenswrapper[4933]: I1202 17:04:37.023856 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerStarted","Data":"b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53"} Dec 02 17:04:38 crc kubenswrapper[4933]: I1202 17:04:38.037498 4933 generic.go:334] "Generic (PLEG): container finished" podID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerID="b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53" exitCode=0 Dec 02 17:04:38 crc kubenswrapper[4933]: I1202 17:04:38.037583 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerDied","Data":"b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53"} Dec 02 17:04:39 crc kubenswrapper[4933]: I1202 17:04:39.050564 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerStarted","Data":"f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4"} Dec 02 17:04:39 crc kubenswrapper[4933]: I1202 17:04:39.089119 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jc6r" podStartSLOduration=2.399959526 podStartE2EDuration="11.08909949s" podCreationTimestamp="2025-12-02 17:04:28 +0000 UTC" firstStartedPulling="2025-12-02 17:04:29.940296557 +0000 UTC m=+4333.191523260" lastFinishedPulling="2025-12-02 17:04:38.629436491 +0000 UTC m=+4341.880663224" observedRunningTime="2025-12-02 17:04:39.082515385 +0000 UTC m=+4342.333742088" watchObservedRunningTime="2025-12-02 17:04:39.08909949 +0000 UTC m=+4342.340326193" Dec 02 17:04:47 crc kubenswrapper[4933]: I1202 17:04:47.170729 4933 patch_prober.go:28] 
interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:04:47 crc kubenswrapper[4933]: I1202 17:04:47.171353 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:04:49 crc kubenswrapper[4933]: I1202 17:04:49.009997 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:49 crc kubenswrapper[4933]: I1202 17:04:49.010298 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:49 crc kubenswrapper[4933]: I1202 17:04:49.096583 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:49 crc kubenswrapper[4933]: I1202 17:04:49.247369 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:49 crc kubenswrapper[4933]: I1202 17:04:49.352307 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jc6r"] Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.229774 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jc6r" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="registry-server" containerID="cri-o://f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4" gracePeriod=2 Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.856429 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.954211 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-catalog-content\") pod \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.954286 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfk2j\" (UniqueName: \"kubernetes.io/projected/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-kube-api-access-wfk2j\") pod \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.954600 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-utilities\") pod \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\" (UID: \"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58\") " Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.955530 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-utilities" (OuterVolumeSpecName: "utilities") pod "0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" (UID: "0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:04:51 crc kubenswrapper[4933]: I1202 17:04:51.961795 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-kube-api-access-wfk2j" (OuterVolumeSpecName: "kube-api-access-wfk2j") pod "0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" (UID: "0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58"). InnerVolumeSpecName "kube-api-access-wfk2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.012128 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" (UID: "0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.058335 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.058366 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.058376 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfk2j\" (UniqueName: \"kubernetes.io/projected/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58-kube-api-access-wfk2j\") on node \"crc\" DevicePath \"\"" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.242103 4933 generic.go:334] "Generic (PLEG): container finished" podID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerID="f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4" exitCode=0 Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.242256 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jc6r" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.242351 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerDied","Data":"f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4"} Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.242591 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jc6r" event={"ID":"0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58","Type":"ContainerDied","Data":"403d9225ab7a16d3649e5bd17f8de7e9d2c8499a5bb538a0bff65eb2bd87e4fe"} Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.242617 4933 scope.go:117] "RemoveContainer" containerID="f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.278244 4933 scope.go:117] "RemoveContainer" containerID="b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.290501 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jc6r"] Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.303795 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jc6r"] Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.307134 4933 scope.go:117] "RemoveContainer" containerID="c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.396141 4933 scope.go:117] "RemoveContainer" containerID="f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4" Dec 02 17:04:52 crc kubenswrapper[4933]: E1202 17:04:52.396630 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4\": container with ID starting with f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4 not found: ID does not exist" containerID="f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.396666 
4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4"} err="failed to get container status \"f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4\": rpc error: code = NotFound desc = could not find container \"f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4\": container with ID starting with f2db237040e0b70f6a21eef9943d290517504bef3dbf2ea140fde1459ec4eea4 not found: ID does not exist" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.396690 4933 scope.go:117] "RemoveContainer" containerID="b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53" Dec 02 17:04:52 crc kubenswrapper[4933]: E1202 17:04:52.397101 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53\": container with ID starting with b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53 not found: ID does not exist" containerID="b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.397124 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53"} err="failed to get container status \"b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53\": rpc error: code = NotFound desc = could not find container \"b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53\": container with ID starting with b6e70f1c10904464526c3dbfdd8fa1600fbb214a7e6afc1c9e89ec9baaefbe53 not found: ID does not exist" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.397142 4933 scope.go:117] "RemoveContainer" containerID="c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204" Dec 02 17:04:52 crc kubenswrapper[4933]: E1202 17:04:52.397448 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204\": container with ID starting with c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204 not found: ID does not exist" containerID="c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204" Dec 02 17:04:52 crc kubenswrapper[4933]: I1202 17:04:52.397472 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204"} err="failed to get container status \"c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204\": rpc error: code = NotFound desc = could not find container \"c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204\": container with ID starting with c10b182644e39028976532355f8d770ead73577ab70600160d6b5ed804958204 not found: ID does not exist" Dec 02 17:04:53 crc kubenswrapper[4933]: I1202 17:04:53.068764 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" path="/var/lib/kubelet/pods/0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58/volumes" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.104694 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pldqv"] Dec 02 17:05:07 crc kubenswrapper[4933]: E1202 17:05:07.106779 4933 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="extract-utilities" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.106820 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="extract-utilities" Dec 02 17:05:07 crc kubenswrapper[4933]: E1202 17:05:07.106947 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="registry-server" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.106970 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="registry-server" Dec 02 17:05:07 crc kubenswrapper[4933]: E1202 17:05:07.107024 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="extract-content" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.107045 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="extract-content" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.107777 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b0fddcf-92f9-45fb-a0f1-cb00c1ab8e58" containerName="registry-server" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.113716 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.135364 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pldqv"] Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.158642 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj6j9\" (UniqueName: \"kubernetes.io/projected/9be88ca8-b344-4fb9-bf66-a9a7611733ad-kube-api-access-cj6j9\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.158717 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-utilities\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.158789 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-catalog-content\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.260512 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-utilities\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.260609 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-catalog-content\") pod \"redhat-operators-pldqv\" (UID: 
\"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.260726 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj6j9\" (UniqueName: \"kubernetes.io/projected/9be88ca8-b344-4fb9-bf66-a9a7611733ad-kube-api-access-cj6j9\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.261615 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-utilities\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.261723 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-catalog-content\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.294599 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj6j9\" (UniqueName: \"kubernetes.io/projected/9be88ca8-b344-4fb9-bf66-a9a7611733ad-kube-api-access-cj6j9\") pod \"redhat-operators-pldqv\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:07 crc kubenswrapper[4933]: I1202 17:05:07.450573 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:08 crc kubenswrapper[4933]: I1202 17:05:08.020746 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pldqv"] Dec 02 17:05:08 crc kubenswrapper[4933]: I1202 17:05:08.438198 4933 generic.go:334] "Generic (PLEG): container finished" podID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerID="901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59" exitCode=0 Dec 02 17:05:08 crc kubenswrapper[4933]: I1202 17:05:08.438244 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerDied","Data":"901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59"} Dec 02 17:05:08 crc kubenswrapper[4933]: I1202 17:05:08.438275 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerStarted","Data":"ecc123ec2a7f89fb9f3406480ded5281292efd727f6f942099f54735fe46e826"} Dec 02 17:05:09 crc kubenswrapper[4933]: I1202 17:05:09.455120 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerStarted","Data":"a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd"} Dec 02 17:05:13 crc kubenswrapper[4933]: I1202 17:05:13.502945 4933 generic.go:334] "Generic (PLEG): container finished" podID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerID="a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd" exitCode=0 Dec 02 17:05:13 crc kubenswrapper[4933]: I1202 17:05:13.502999 4933 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerDied","Data":"a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd"} Dec 02 17:05:14 crc kubenswrapper[4933]: I1202 17:05:14.520103 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerStarted","Data":"8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc"} Dec 02 17:05:14 crc kubenswrapper[4933]: I1202 17:05:14.551278 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pldqv" podStartSLOduration=1.985348622 podStartE2EDuration="7.551260827s" podCreationTimestamp="2025-12-02 17:05:07 +0000 UTC" firstStartedPulling="2025-12-02 17:05:08.440391987 +0000 UTC m=+4371.691618690" lastFinishedPulling="2025-12-02 17:05:14.006304192 +0000 UTC m=+4377.257530895" observedRunningTime="2025-12-02 17:05:14.548103763 +0000 UTC m=+4377.799330506" watchObservedRunningTime="2025-12-02 17:05:14.551260827 +0000 UTC m=+4377.802487530" Dec 02 17:05:17 crc kubenswrapper[4933]: I1202 17:05:17.169473 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:05:17 crc kubenswrapper[4933]: I1202 17:05:17.170158 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:05:17 crc kubenswrapper[4933]: I1202 17:05:17.450953 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:17 crc kubenswrapper[4933]: I1202 17:05:17.451188 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:18 crc kubenswrapper[4933]: I1202 17:05:18.511101 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pldqv" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="registry-server" probeResult="failure" output=< Dec 02 17:05:18 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:05:18 crc kubenswrapper[4933]: > Dec 02 17:05:27 crc kubenswrapper[4933]: I1202 17:05:27.522353 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:27 crc kubenswrapper[4933]: I1202 17:05:27.581895 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:27 crc kubenswrapper[4933]: I1202 17:05:27.758546 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pldqv"] Dec 02 17:05:28 crc kubenswrapper[4933]: I1202 17:05:28.689626 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pldqv" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="registry-server" 
containerID="cri-o://8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc" gracePeriod=2 Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.220661 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.354460 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-utilities\") pod \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.354943 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj6j9\" (UniqueName: \"kubernetes.io/projected/9be88ca8-b344-4fb9-bf66-a9a7611733ad-kube-api-access-cj6j9\") pod \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.355268 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-catalog-content\") pod \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\" (UID: \"9be88ca8-b344-4fb9-bf66-a9a7611733ad\") " Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.355349 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-utilities" (OuterVolumeSpecName: "utilities") pod "9be88ca8-b344-4fb9-bf66-a9a7611733ad" (UID: "9be88ca8-b344-4fb9-bf66-a9a7611733ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.356259 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.361183 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be88ca8-b344-4fb9-bf66-a9a7611733ad-kube-api-access-cj6j9" (OuterVolumeSpecName: "kube-api-access-cj6j9") pod "9be88ca8-b344-4fb9-bf66-a9a7611733ad" (UID: "9be88ca8-b344-4fb9-bf66-a9a7611733ad"). InnerVolumeSpecName "kube-api-access-cj6j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.458603 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj6j9\" (UniqueName: \"kubernetes.io/projected/9be88ca8-b344-4fb9-bf66-a9a7611733ad-kube-api-access-cj6j9\") on node \"crc\" DevicePath \"\"" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.497534 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9be88ca8-b344-4fb9-bf66-a9a7611733ad" (UID: "9be88ca8-b344-4fb9-bf66-a9a7611733ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.561197 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be88ca8-b344-4fb9-bf66-a9a7611733ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.705116 4933 generic.go:334] "Generic (PLEG): container finished" podID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerID="8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc" exitCode=0 Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.705169 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerDied","Data":"8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc"} Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.705212 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pldqv" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.705256 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pldqv" event={"ID":"9be88ca8-b344-4fb9-bf66-a9a7611733ad","Type":"ContainerDied","Data":"ecc123ec2a7f89fb9f3406480ded5281292efd727f6f942099f54735fe46e826"} Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.705298 4933 scope.go:117] "RemoveContainer" containerID="8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.755594 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pldqv"] Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.758354 4933 scope.go:117] "RemoveContainer" containerID="a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.769580 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pldqv"] Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.789663 4933 scope.go:117] "RemoveContainer" containerID="901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.836956 4933 scope.go:117] "RemoveContainer" containerID="8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc" Dec 02 17:05:29 crc kubenswrapper[4933]: E1202 17:05:29.837506 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc\": container with ID starting with 8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc not found: ID does not exist" containerID="8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc" Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.837534 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc"} err="failed to get container status \"8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc\": rpc error: code = NotFound desc = could not find container \"8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc\": container with ID starting with 8724d5e196bc8609e82984f5061d70e58ea93a6e701e2a6f0d33c47c6bdb45fc not found: ID does not exist" Dec 02 17:05:29 crc 
Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.837569 4933 scope.go:117] "RemoveContainer" containerID="a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd"
Dec 02 17:05:29 crc kubenswrapper[4933]: E1202 17:05:29.838330 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd\": container with ID starting with a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd not found: ID does not exist" containerID="a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd"
Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.838365 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd"} err="failed to get container status \"a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd\": rpc error: code = NotFound desc = could not find container \"a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd\": container with ID starting with a88d7ad98ef342ca7b3360eae070017b5220c092efed25b2d0325f635a6154bd not found: ID does not exist"
Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.838379 4933 scope.go:117] "RemoveContainer" containerID="901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59"
Dec 02 17:05:29 crc kubenswrapper[4933]: E1202 17:05:29.838651 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59\": container with ID starting with 901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59 not found: ID does not exist" containerID="901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59"
Dec 02 17:05:29 crc kubenswrapper[4933]: I1202 17:05:29.838701 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59"} err="failed to get container status \"901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59\": rpc error: code = NotFound desc = could not find container \"901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59\": container with ID starting with 901bee3a744cd8697595ab4b6a5e5bfb83db0ea23870ffb7b6986c216266cf59 not found: ID does not exist"
Dec 02 17:05:31 crc kubenswrapper[4933]: I1202 17:05:31.066136 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" path="/var/lib/kubelet/pods/9be88ca8-b344-4fb9-bf66-a9a7611733ad/volumes"
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.169575 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.170358 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.170434 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w"
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.171546 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.171613 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" gracePeriod=600
Dec 02 17:05:47 crc kubenswrapper[4933]: E1202 17:05:47.301269 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.949095 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" exitCode=0
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.949177 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"}
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.949390 4933 scope.go:117] "RemoveContainer" containerID="51d509c3d4a4f36dfcf653f9a16a1e6d29bd39eaec00991530f2e151f65f7108"
Dec 02 17:05:47 crc kubenswrapper[4933]: I1202 17:05:47.950376 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
Dec 02 17:05:47 crc kubenswrapper[4933]: E1202 17:05:47.950933 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:05:52 crc kubenswrapper[4933]: E1202 17:05:52.720016 4933 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.213:43062->38.102.83.213:44657: read tcp 38.102.83.213:43062->38.102.83.213:44657: read: connection reset by peer
Dec 02 17:06:00 crc kubenswrapper[4933]: I1202 17:06:00.053914 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
Dec 02 17:06:00 crc kubenswrapper[4933]: E1202 17:06:00.054745 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:06:11 crc kubenswrapper[4933]: I1202 17:06:11.053809 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
Dec 02 17:06:11 crc kubenswrapper[4933]: E1202 17:06:11.054655 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:06:24 crc kubenswrapper[4933]: I1202 17:06:24.054979 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
Dec 02 17:06:24 crc kubenswrapper[4933]: E1202 17:06:24.055950 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.760589 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgbfj"]
Dec 02 17:06:27 crc kubenswrapper[4933]: E1202 17:06:27.762171 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="extract-utilities"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.762194 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="extract-utilities"
Dec 02 17:06:27 crc kubenswrapper[4933]: E1202 17:06:27.762240 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="registry-server"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.762253 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="registry-server"
Dec 02 17:06:27 crc kubenswrapper[4933]: E1202 17:06:27.762291 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="extract-content"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.762304 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="extract-content"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.762698 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be88ca8-b344-4fb9-bf66-a9a7611733ad" containerName="registry-server"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.765781 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.783043 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgbfj"]
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.811347 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-utilities\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.811552 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-catalog-content\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.811598 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdfp\" (UniqueName: \"kubernetes.io/projected/00363862-da13-46e1-a527-352bf53291b6-kube-api-access-6hdfp\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.914367 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-catalog-content\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.914456 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hdfp\" (UniqueName: \"kubernetes.io/projected/00363862-da13-46e1-a527-352bf53291b6-kube-api-access-6hdfp\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.914701 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-utilities\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.915283 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-catalog-content\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.915392 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-utilities\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:27 crc kubenswrapper[4933]: I1202 17:06:27.935656 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdfp\" (UniqueName: \"kubernetes.io/projected/00363862-da13-46e1-a527-352bf53291b6-kube-api-access-6hdfp\") pod \"community-operators-kgbfj\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") " pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:28 crc kubenswrapper[4933]: I1202 17:06:28.107775 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:28 crc kubenswrapper[4933]: I1202 17:06:28.674302 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgbfj"]
Dec 02 17:06:28 crc kubenswrapper[4933]: W1202 17:06:28.791395 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00363862_da13_46e1_a527_352bf53291b6.slice/crio-72264123b564c5adffae8a564f11ad17297fe1c080856eec19f85596ff93367e WatchSource:0}: Error finding container 72264123b564c5adffae8a564f11ad17297fe1c080856eec19f85596ff93367e: Status 404 returned error can't find the container with id 72264123b564c5adffae8a564f11ad17297fe1c080856eec19f85596ff93367e
Dec 02 17:06:29 crc kubenswrapper[4933]: I1202 17:06:29.611782 4933 generic.go:334] "Generic (PLEG): container finished" podID="00363862-da13-46e1-a527-352bf53291b6" containerID="2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5" exitCode=0
Dec 02 17:06:29 crc kubenswrapper[4933]: I1202 17:06:29.612133 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgbfj" event={"ID":"00363862-da13-46e1-a527-352bf53291b6","Type":"ContainerDied","Data":"2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5"}
Dec 02 17:06:29 crc kubenswrapper[4933]: I1202 17:06:29.612170 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgbfj" event={"ID":"00363862-da13-46e1-a527-352bf53291b6","Type":"ContainerStarted","Data":"72264123b564c5adffae8a564f11ad17297fe1c080856eec19f85596ff93367e"}
Dec 02 17:06:29 crc kubenswrapper[4933]: I1202 17:06:29.614877 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 17:06:31 crc kubenswrapper[4933]: I1202 17:06:31.647167 4933 generic.go:334] "Generic (PLEG): container finished" podID="00363862-da13-46e1-a527-352bf53291b6" containerID="aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba" exitCode=0
Dec 02 17:06:31 crc kubenswrapper[4933]: I1202 17:06:31.647308 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgbfj" event={"ID":"00363862-da13-46e1-a527-352bf53291b6","Type":"ContainerDied","Data":"aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba"}
Dec 02 17:06:32 crc kubenswrapper[4933]: I1202 17:06:32.670947 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgbfj" event={"ID":"00363862-da13-46e1-a527-352bf53291b6","Type":"ContainerStarted","Data":"7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d"}
Dec 02 17:06:32 crc kubenswrapper[4933]: I1202 17:06:32.708078 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgbfj" podStartSLOduration=3.272125772 podStartE2EDuration="5.708053152s" podCreationTimestamp="2025-12-02 17:06:27 +0000 UTC" firstStartedPulling="2025-12-02 17:06:29.614585217 +0000 UTC m=+4452.865811930" lastFinishedPulling="2025-12-02 17:06:32.050512587 +0000 UTC m=+4455.301739310" observedRunningTime="2025-12-02 17:06:32.691611465 +0000 UTC m=+4455.942838178" watchObservedRunningTime="2025-12-02 17:06:32.708053152 +0000 UTC m=+4455.959279875"
Dec 02 17:06:38 crc kubenswrapper[4933]: I1202 17:06:38.053125 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
Dec 02 17:06:38 crc kubenswrapper[4933]: E1202 17:06:38.053900 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:06:38 crc kubenswrapper[4933]: I1202 17:06:38.108213 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:38 crc kubenswrapper[4933]: I1202 17:06:38.108291 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:38 crc kubenswrapper[4933]: I1202 17:06:38.176948 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:38 crc kubenswrapper[4933]: I1202 17:06:38.803552 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:38 crc kubenswrapper[4933]: I1202 17:06:38.854788 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgbfj"]
Dec 02 17:06:40 crc kubenswrapper[4933]: I1202 17:06:40.763355 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgbfj" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="registry-server" containerID="cri-o://7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d" gracePeriod=2
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.332434 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgbfj"
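The registry-server startup and readiness probes in these marketplace pods poll the catalog's gRPC endpoint on :50051; the earlier pldqv failure (timeout: failed to connect service ":50051" within 1s) is the probe giving up after one second, and the kgbfj pod above goes through the same unhealthy-then-started transition while the catalog loads. The real probe is a gRPC health check, which is an assumption here; a plain TCP dial with the same address and timeout reproduces the failure mode:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Rough stand-in for the startup probe: if the registry-server is not
	// yet listening on :50051, the dial fails within 1s, just as the
	// "failed to connect service" probe output reports above.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
	if err != nil {
		fmt.Println("startup probe failure:", err)
		return
	}
	conn.Close()
	fmt.Println("startup probe success")
}
```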
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.426045 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-utilities\") pod \"00363862-da13-46e1-a527-352bf53291b6\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") "
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.426127 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-catalog-content\") pod \"00363862-da13-46e1-a527-352bf53291b6\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") "
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.426250 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hdfp\" (UniqueName: \"kubernetes.io/projected/00363862-da13-46e1-a527-352bf53291b6-kube-api-access-6hdfp\") pod \"00363862-da13-46e1-a527-352bf53291b6\" (UID: \"00363862-da13-46e1-a527-352bf53291b6\") "
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.427323 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-utilities" (OuterVolumeSpecName: "utilities") pod "00363862-da13-46e1-a527-352bf53291b6" (UID: "00363862-da13-46e1-a527-352bf53291b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.442723 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00363862-da13-46e1-a527-352bf53291b6-kube-api-access-6hdfp" (OuterVolumeSpecName: "kube-api-access-6hdfp") pod "00363862-da13-46e1-a527-352bf53291b6" (UID: "00363862-da13-46e1-a527-352bf53291b6"). InnerVolumeSpecName "kube-api-access-6hdfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.528604 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.528651 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hdfp\" (UniqueName: \"kubernetes.io/projected/00363862-da13-46e1-a527-352bf53291b6-kube-api-access-6hdfp\") on node \"crc\" DevicePath \"\""
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.694651 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00363862-da13-46e1-a527-352bf53291b6" (UID: "00363862-da13-46e1-a527-352bf53291b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.732180 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00363862-da13-46e1-a527-352bf53291b6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.778342 4933 generic.go:334] "Generic (PLEG): container finished" podID="00363862-da13-46e1-a527-352bf53291b6" containerID="7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d" exitCode=0
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.778392 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgbfj"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.778410 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgbfj" event={"ID":"00363862-da13-46e1-a527-352bf53291b6","Type":"ContainerDied","Data":"7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d"}
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.778459 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgbfj" event={"ID":"00363862-da13-46e1-a527-352bf53291b6","Type":"ContainerDied","Data":"72264123b564c5adffae8a564f11ad17297fe1c080856eec19f85596ff93367e"}
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.778488 4933 scope.go:117] "RemoveContainer" containerID="7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.821222 4933 scope.go:117] "RemoveContainer" containerID="aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.843724 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgbfj"]
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.858669 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kgbfj"]
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.864004 4933 scope.go:117] "RemoveContainer" containerID="2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.923475 4933 scope.go:117] "RemoveContainer" containerID="7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d"
Dec 02 17:06:41 crc kubenswrapper[4933]: E1202 17:06:41.924395 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d\": container with ID starting with 7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d not found: ID does not exist" containerID="7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.924791 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d"} err="failed to get container status \"7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d\": rpc error: code = NotFound desc = could not find container \"7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d\": container with ID starting with 7925a6d21c2d0454edf164dfbf134a0e4811574a253bb189d2da4c983471398d not found: ID does not exist"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.924957 4933 scope.go:117] "RemoveContainer" containerID="aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba"
Dec 02 17:06:41 crc kubenswrapper[4933]: E1202 17:06:41.925469 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba\": container with ID starting with aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba not found: ID does not exist" containerID="aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.925492 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba"} err="failed to get container status \"aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba\": rpc error: code = NotFound desc = could not find container \"aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba\": container with ID starting with aa1ef3a2ed2ec67b1b218d9836c066098bc0dca890eac195f6d40d21d4b93dba not found: ID does not exist"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.925507 4933 scope.go:117] "RemoveContainer" containerID="2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5"
Dec 02 17:06:41 crc kubenswrapper[4933]: E1202 17:06:41.925776 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5\": container with ID starting with 2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5 not found: ID does not exist" containerID="2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5"
Dec 02 17:06:41 crc kubenswrapper[4933]: I1202 17:06:41.925797 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5"} err="failed to get container status \"2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5\": rpc error: code = NotFound desc = could not find container \"2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5\": container with ID starting with 2df74d8aec6729cc1cbff96f2a0a35df0b8ec4151d1de8134c94936c4eb0b0a5 not found: ID does not exist"
Dec 02 17:06:43 crc kubenswrapper[4933]: I1202 17:06:43.077435 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00363862-da13-46e1-a527-352bf53291b6" path="/var/lib/kubelet/pods/00363862-da13-46e1-a527-352bf53291b6/volumes"
Dec 02 17:06:53 crc kubenswrapper[4933]: I1202 17:06:53.060994 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
Dec 02 17:06:53 crc kubenswrapper[4933]: E1202 17:06:53.062911 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:07:06 crc kubenswrapper[4933]: I1202 17:07:06.053462 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa"
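The kgbfj teardown above repeats the fixed ordering seen twice already in this log: API DELETE, kill with a grace period (gracePeriod=2 here), volume unmount and detach, PLEG ContainerDied events, container removal, API REMOVE, and finally the orphaned volumes directory sweep. The kill step follows the standard TERM-then-KILL contract, which CRI-O enforces at the runtime level; a process-level sketch of the same idea, illustrative only:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace mirrors "Killing container with a grace period": SIGTERM
// first, SIGKILL only if the process outlives the grace period.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	_ = cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill()
		<-done
		fmt.Println("killed after grace period expired")
	}
}

func main() {
	cmd := exec.Command("sleep", "30")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 2*time.Second) // gracePeriod=2, as for registry-server above
}
```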
containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:07:06 crc kubenswrapper[4933]: E1202 17:07:06.054373 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:07:19 crc kubenswrapper[4933]: I1202 17:07:19.052874 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:07:19 crc kubenswrapper[4933]: E1202 17:07:19.053623 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:07:30 crc kubenswrapper[4933]: I1202 17:07:30.055483 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:07:30 crc kubenswrapper[4933]: E1202 17:07:30.057546 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:07:36 crc kubenswrapper[4933]: E1202 17:07:36.567472 4933 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:32984->38.102.83.213:44657: write tcp 38.102.83.213:32984->38.102.83.213:44657: write: broken pipe Dec 02 17:07:41 crc kubenswrapper[4933]: I1202 17:07:41.053529 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:07:41 crc kubenswrapper[4933]: E1202 17:07:41.055673 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:07:55 crc kubenswrapper[4933]: I1202 17:07:55.053713 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:07:55 crc kubenswrapper[4933]: E1202 17:07:55.054685 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:08:08 crc 
kubenswrapper[4933]: I1202 17:08:08.053592 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:08:08 crc kubenswrapper[4933]: E1202 17:08:08.054307 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:08:21 crc kubenswrapper[4933]: I1202 17:08:21.054109 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:08:21 crc kubenswrapper[4933]: E1202 17:08:21.054895 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:08:36 crc kubenswrapper[4933]: I1202 17:08:36.054800 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:08:36 crc kubenswrapper[4933]: E1202 17:08:36.055863 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:08:49 crc kubenswrapper[4933]: I1202 17:08:49.054695 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:08:49 crc kubenswrapper[4933]: E1202 17:08:49.055542 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:09:02 crc kubenswrapper[4933]: I1202 17:09:02.054429 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:09:02 crc kubenswrapper[4933]: E1202 17:09:02.058018 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:09:16 crc kubenswrapper[4933]: I1202 17:09:16.054232 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:09:16 crc 
kubenswrapper[4933]: E1202 17:09:16.055125 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:09:27 crc kubenswrapper[4933]: I1202 17:09:27.073436 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:09:27 crc kubenswrapper[4933]: E1202 17:09:27.074753 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:09:38 crc kubenswrapper[4933]: I1202 17:09:38.053490 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:09:38 crc kubenswrapper[4933]: E1202 17:09:38.054556 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:09:51 crc kubenswrapper[4933]: I1202 17:09:51.053292 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:09:51 crc kubenswrapper[4933]: E1202 17:09:51.054139 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:10:06 crc kubenswrapper[4933]: I1202 17:10:06.054214 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:10:06 crc kubenswrapper[4933]: E1202 17:10:06.055301 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:10:21 crc kubenswrapper[4933]: I1202 17:10:21.055928 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:10:21 crc kubenswrapper[4933]: E1202 17:10:21.057005 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:10:32 crc kubenswrapper[4933]: I1202 17:10:32.053393 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:10:32 crc kubenswrapper[4933]: E1202 17:10:32.054252 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:10:43 crc kubenswrapper[4933]: I1202 17:10:43.055488 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:10:43 crc kubenswrapper[4933]: E1202 17:10:43.057155 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:10:56 crc kubenswrapper[4933]: I1202 17:10:56.054051 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:10:57 crc kubenswrapper[4933]: I1202 17:10:57.964372 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"c3b07b288ed93f4f920f4fab70c864505c565fe378f7ea5e73804f6086777bc6"} Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.790899 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trnkb"] Dec 02 17:10:59 crc kubenswrapper[4933]: E1202 17:10:59.791762 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="extract-content" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.791780 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="extract-content" Dec 02 17:10:59 crc kubenswrapper[4933]: E1202 17:10:59.791808 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="extract-utilities" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.791834 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="extract-utilities" Dec 02 17:10:59 crc kubenswrapper[4933]: E1202 17:10:59.791876 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="registry-server" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.791885 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="registry-server" Dec 02 17:10:59 crc kubenswrapper[4933]: 
I1202 17:10:59.792155 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="00363862-da13-46e1-a527-352bf53291b6" containerName="registry-server" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.794347 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.826380 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-catalog-content\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.826808 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-utilities\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.827014 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trnkb"] Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.827124 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmnl\" (UniqueName: \"kubernetes.io/projected/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-kube-api-access-rmmnl\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.930210 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-utilities\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.930559 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmnl\" (UniqueName: \"kubernetes.io/projected/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-kube-api-access-rmmnl\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.930808 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-catalog-content\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.931151 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-utilities\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.931243 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-catalog-content\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:10:59 crc kubenswrapper[4933]: I1202 17:10:59.952779 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmnl\" (UniqueName: \"kubernetes.io/projected/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-kube-api-access-rmmnl\") pod \"redhat-marketplace-trnkb\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:00 crc kubenswrapper[4933]: I1202 17:11:00.144509 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:00 crc kubenswrapper[4933]: I1202 17:11:00.691000 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trnkb"] Dec 02 17:11:00 crc kubenswrapper[4933]: W1202 17:11:00.700229 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f184aa3_1bdf_4b35_a2c3_195ac77a6581.slice/crio-d27087010a211362c67c4af23960c8bf27751c81875b3051124d29f7c35aa3a9 WatchSource:0}: Error finding container d27087010a211362c67c4af23960c8bf27751c81875b3051124d29f7c35aa3a9: Status 404 returned error can't find the container with id d27087010a211362c67c4af23960c8bf27751c81875b3051124d29f7c35aa3a9 Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.060119 4933 generic.go:334] "Generic (PLEG): container finished" podID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerID="21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2" exitCode=0 Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.073727 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerDied","Data":"21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2"} Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.073783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerStarted","Data":"d27087010a211362c67c4af23960c8bf27751c81875b3051124d29f7c35aa3a9"} Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.218534 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.220746 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.222774 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.222927 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.223508 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.226483 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jwtt2" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.229901 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.398365 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-config-data\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.398424 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.398522 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.398724 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.398903 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.399033 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.399102 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcnz\" (UniqueName: 
\"kubernetes.io/projected/bd0fb308-1858-4dea-bf49-38e577824bd0-kube-api-access-crcnz\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.399221 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.399401 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.501466 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.501635 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-config-data\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.501660 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.501694 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.501739 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.501792 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.502534 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.502609 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.502651 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crcnz\" (UniqueName: \"kubernetes.io/projected/bd0fb308-1858-4dea-bf49-38e577824bd0-kube-api-access-crcnz\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.502703 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.502946 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.503277 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-config-data\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.503363 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.503486 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.512466 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.512773 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 
02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.515438 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.527597 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcnz\" (UniqueName: \"kubernetes.io/projected/bd0fb308-1858-4dea-bf49-38e577824bd0-kube-api-access-crcnz\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.552971 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " pod="openstack/tempest-tests-tempest" Dec 02 17:11:01 crc kubenswrapper[4933]: I1202 17:11:01.838692 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 17:11:02 crc kubenswrapper[4933]: I1202 17:11:02.072973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerStarted","Data":"7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df"} Dec 02 17:11:02 crc kubenswrapper[4933]: W1202 17:11:02.321765 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd0fb308_1858_4dea_bf49_38e577824bd0.slice/crio-68b06c06990fa20c06300cc85138725137d1ad0c2386088be8f9caab89076c55 WatchSource:0}: Error finding container 68b06c06990fa20c06300cc85138725137d1ad0c2386088be8f9caab89076c55: Status 404 returned error can't find the container with id 68b06c06990fa20c06300cc85138725137d1ad0c2386088be8f9caab89076c55 Dec 02 17:11:02 crc kubenswrapper[4933]: I1202 17:11:02.328815 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 17:11:03 crc kubenswrapper[4933]: I1202 17:11:03.099711 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd0fb308-1858-4dea-bf49-38e577824bd0","Type":"ContainerStarted","Data":"68b06c06990fa20c06300cc85138725137d1ad0c2386088be8f9caab89076c55"} Dec 02 17:11:03 crc kubenswrapper[4933]: I1202 17:11:03.105209 4933 generic.go:334] "Generic (PLEG): container finished" podID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerID="7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df" exitCode=0 Dec 02 17:11:03 crc kubenswrapper[4933]: I1202 17:11:03.105259 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerDied","Data":"7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df"} Dec 02 17:11:05 crc kubenswrapper[4933]: I1202 17:11:05.136655 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerStarted","Data":"9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e"} Dec 02 17:11:05 crc kubenswrapper[4933]: I1202 
17:11:05.158854 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trnkb" podStartSLOduration=3.320723483 podStartE2EDuration="6.158836631s" podCreationTimestamp="2025-12-02 17:10:59 +0000 UTC" firstStartedPulling="2025-12-02 17:11:01.063017867 +0000 UTC m=+4724.314244570" lastFinishedPulling="2025-12-02 17:11:03.901131015 +0000 UTC m=+4727.152357718" observedRunningTime="2025-12-02 17:11:05.157434714 +0000 UTC m=+4728.408661427" watchObservedRunningTime="2025-12-02 17:11:05.158836631 +0000 UTC m=+4728.410063334" Dec 02 17:11:10 crc kubenswrapper[4933]: I1202 17:11:10.147299 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:10 crc kubenswrapper[4933]: I1202 17:11:10.148102 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:10 crc kubenswrapper[4933]: I1202 17:11:10.205236 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:10 crc kubenswrapper[4933]: I1202 17:11:10.291494 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:10 crc kubenswrapper[4933]: I1202 17:11:10.466041 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trnkb"] Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.226479 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-trnkb" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="registry-server" containerID="cri-o://9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e" gracePeriod=2 Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.787979 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.866598 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-utilities\") pod \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.866991 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-catalog-content\") pod \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.867050 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmmnl\" (UniqueName: \"kubernetes.io/projected/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-kube-api-access-rmmnl\") pod \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\" (UID: \"1f184aa3-1bdf-4b35-a2c3-195ac77a6581\") " Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.867219 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-utilities" (OuterVolumeSpecName: "utilities") pod "1f184aa3-1bdf-4b35-a2c3-195ac77a6581" (UID: "1f184aa3-1bdf-4b35-a2c3-195ac77a6581"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.867635 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.873488 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-kube-api-access-rmmnl" (OuterVolumeSpecName: "kube-api-access-rmmnl") pod "1f184aa3-1bdf-4b35-a2c3-195ac77a6581" (UID: "1f184aa3-1bdf-4b35-a2c3-195ac77a6581"). InnerVolumeSpecName "kube-api-access-rmmnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.885962 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f184aa3-1bdf-4b35-a2c3-195ac77a6581" (UID: "1f184aa3-1bdf-4b35-a2c3-195ac77a6581"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.970815 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:11:12 crc kubenswrapper[4933]: I1202 17:11:12.970889 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmmnl\" (UniqueName: \"kubernetes.io/projected/1f184aa3-1bdf-4b35-a2c3-195ac77a6581-kube-api-access-rmmnl\") on node \"crc\" DevicePath \"\"" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.239091 4933 generic.go:334] "Generic (PLEG): container finished" podID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerID="9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e" exitCode=0 Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.239148 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerDied","Data":"9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e"} Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.239461 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trnkb" event={"ID":"1f184aa3-1bdf-4b35-a2c3-195ac77a6581","Type":"ContainerDied","Data":"d27087010a211362c67c4af23960c8bf27751c81875b3051124d29f7c35aa3a9"} Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.239490 4933 scope.go:117] "RemoveContainer" containerID="9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.239169 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trnkb" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.269914 4933 scope.go:117] "RemoveContainer" containerID="7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.272044 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trnkb"] Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.283383 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-trnkb"] Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.302892 4933 scope.go:117] "RemoveContainer" containerID="21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.364007 4933 scope.go:117] "RemoveContainer" containerID="9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e" Dec 02 17:11:13 crc kubenswrapper[4933]: E1202 17:11:13.364566 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e\": container with ID starting with 9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e not found: ID does not exist" containerID="9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.364596 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e"} err="failed to get container status \"9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e\": rpc error: code = NotFound desc = could not find container \"9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e\": container with ID starting with 9137b2a519bb95af2b726ae32faf8eb16294cb3fcea2c7fbb029012292eb475e not found: ID does not exist" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.364617 4933 scope.go:117] "RemoveContainer" containerID="7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df" Dec 02 17:11:13 crc kubenswrapper[4933]: E1202 17:11:13.365027 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df\": container with ID starting with 7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df not found: ID does not exist" containerID="7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.365057 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df"} err="failed to get container status \"7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df\": rpc error: code = NotFound desc = could not find container \"7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df\": container with ID starting with 7eb643de2dafd3104c4d35e959a2ce56a7da606abce9ff9d55d10b56249268df not found: ID does not exist" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.365073 4933 scope.go:117] "RemoveContainer" containerID="21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2" Dec 02 17:11:13 crc kubenswrapper[4933]: E1202 17:11:13.365348 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2\": container with ID starting with 21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2 not found: ID does not exist" containerID="21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2" Dec 02 17:11:13 crc kubenswrapper[4933]: I1202 17:11:13.365365 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2"} err="failed to get container status \"21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2\": rpc error: code = NotFound desc = could not find container \"21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2\": container with ID starting with 21d405eadac1b1d48509bb351461479c0eebe8fe26bdfadf199012bf27a5a9c2 not found: ID does not exist" Dec 02 17:11:15 crc kubenswrapper[4933]: I1202 17:11:15.070762 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" path="/var/lib/kubelet/pods/1f184aa3-1bdf-4b35-a2c3-195ac77a6581/volumes" Dec 02 17:11:38 crc kubenswrapper[4933]: E1202 17:11:38.819785 4933 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 02 17:11:38 crc kubenswrapper[4933]: E1202 17:11:38.823062 4933 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-crcnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(bd0fb308-1858-4dea-bf49-38e577824bd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 17:11:38 crc kubenswrapper[4933]: E1202 17:11:38.824355 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="bd0fb308-1858-4dea-bf49-38e577824bd0" Dec 02 17:11:39 crc kubenswrapper[4933]: E1202 17:11:39.557046 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="bd0fb308-1858-4dea-bf49-38e577824bd0" Dec 02 17:11:52 crc kubenswrapper[4933]: I1202 17:11:52.056398 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:11:52 crc kubenswrapper[4933]: I1202 17:11:52.544165 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 17:11:54 crc kubenswrapper[4933]: I1202 17:11:54.747758 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd0fb308-1858-4dea-bf49-38e577824bd0","Type":"ContainerStarted","Data":"1c80addd77e6b1b4f6b362ae8a129d8f952468d11ffcdf63469f001d777b88f8"} Dec 02 17:11:54 crc kubenswrapper[4933]: I1202 17:11:54.775663 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.559486302 podStartE2EDuration="54.775642731s" podCreationTimestamp="2025-12-02 17:11:00 +0000 UTC" firstStartedPulling="2025-12-02 17:11:02.324640167 +0000 UTC m=+4725.575866870" lastFinishedPulling="2025-12-02 17:11:52.540796596 +0000 UTC m=+4775.792023299" observedRunningTime="2025-12-02 17:11:54.771728347 +0000 UTC m=+4778.022955050" watchObservedRunningTime="2025-12-02 17:11:54.775642731 +0000 UTC m=+4778.026869434" Dec 02 17:13:17 crc kubenswrapper[4933]: I1202 17:13:17.169851 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:13:17 crc kubenswrapper[4933]: I1202 17:13:17.171503 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:13:47 crc kubenswrapper[4933]: I1202 17:13:47.169915 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:13:47 crc kubenswrapper[4933]: I1202 17:13:47.170443 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.168978 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.169559 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.169984 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.170995 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3b07b288ed93f4f920f4fab70c864505c565fe378f7ea5e73804f6086777bc6"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.171388 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://c3b07b288ed93f4f920f4fab70c864505c565fe378f7ea5e73804f6086777bc6" gracePeriod=600 Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.470973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"c3b07b288ed93f4f920f4fab70c864505c565fe378f7ea5e73804f6086777bc6"} Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.470996 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" 
containerID="c3b07b288ed93f4f920f4fab70c864505c565fe378f7ea5e73804f6086777bc6" exitCode=0 Dec 02 17:14:17 crc kubenswrapper[4933]: I1202 17:14:17.471692 4933 scope.go:117] "RemoveContainer" containerID="92f6200fd90e9087a7ba6ed1bcde0586974cb341a4e26e21b4dc087341ea7eaa" Dec 02 17:14:18 crc kubenswrapper[4933]: I1202 17:14:18.484652 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"} Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.230129 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7"] Dec 02 17:15:00 crc kubenswrapper[4933]: E1202 17:15:00.231206 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="extract-utilities" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.231340 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="extract-utilities" Dec 02 17:15:00 crc kubenswrapper[4933]: E1202 17:15:00.231364 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="registry-server" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.231371 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="registry-server" Dec 02 17:15:00 crc kubenswrapper[4933]: E1202 17:15:00.231388 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="extract-content" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.231393 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="extract-content" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.231642 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f184aa3-1bdf-4b35-a2c3-195ac77a6581" containerName="registry-server" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.233089 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.237233 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.238590 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.369113 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7"] Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.424474 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fb8219e-7137-4924-a65f-eddda075a4b1-secret-volume\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.424594 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fb8219e-7137-4924-a65f-eddda075a4b1-config-volume\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.424620 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmszd\" (UniqueName: \"kubernetes.io/projected/9fb8219e-7137-4924-a65f-eddda075a4b1-kube-api-access-vmszd\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.528182 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fb8219e-7137-4924-a65f-eddda075a4b1-secret-volume\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.528268 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fb8219e-7137-4924-a65f-eddda075a4b1-config-volume\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.528325 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmszd\" (UniqueName: \"kubernetes.io/projected/9fb8219e-7137-4924-a65f-eddda075a4b1-kube-api-access-vmszd\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.531650 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fb8219e-7137-4924-a65f-eddda075a4b1-config-volume\") pod 
\"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.543373 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fb8219e-7137-4924-a65f-eddda075a4b1-secret-volume\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.547160 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmszd\" (UniqueName: \"kubernetes.io/projected/9fb8219e-7137-4924-a65f-eddda075a4b1-kube-api-access-vmszd\") pod \"collect-profiles-29411595-nd6j7\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:00 crc kubenswrapper[4933]: I1202 17:15:00.553493 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:01 crc kubenswrapper[4933]: I1202 17:15:01.194756 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7"] Dec 02 17:15:01 crc kubenswrapper[4933]: I1202 17:15:01.949418 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" event={"ID":"9fb8219e-7137-4924-a65f-eddda075a4b1","Type":"ContainerStarted","Data":"e8224b782abf74a20cc0fb560237b6f544708ba2f7703737a4132b4c9327004d"} Dec 02 17:15:01 crc kubenswrapper[4933]: I1202 17:15:01.949722 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" event={"ID":"9fb8219e-7137-4924-a65f-eddda075a4b1","Type":"ContainerStarted","Data":"c64b812317608a737be389036c63cf4085afb42eb9cfe12435214a1851ff0d05"} Dec 02 17:15:02 crc kubenswrapper[4933]: I1202 17:15:02.017261 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" podStartSLOduration=2.017225174 podStartE2EDuration="2.017225174s" podCreationTimestamp="2025-12-02 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 17:15:02.001443364 +0000 UTC m=+4965.252670067" watchObservedRunningTime="2025-12-02 17:15:02.017225174 +0000 UTC m=+4965.268451877" Dec 02 17:15:02 crc kubenswrapper[4933]: I1202 17:15:02.961422 4933 generic.go:334] "Generic (PLEG): container finished" podID="9fb8219e-7137-4924-a65f-eddda075a4b1" containerID="e8224b782abf74a20cc0fb560237b6f544708ba2f7703737a4132b4c9327004d" exitCode=0 Dec 02 17:15:02 crc kubenswrapper[4933]: I1202 17:15:02.961465 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" event={"ID":"9fb8219e-7137-4924-a65f-eddda075a4b1","Type":"ContainerDied","Data":"e8224b782abf74a20cc0fb560237b6f544708ba2f7703737a4132b4c9327004d"} Dec 02 17:15:04 crc kubenswrapper[4933]: I1202 17:15:04.988262 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" 
event={"ID":"9fb8219e-7137-4924-a65f-eddda075a4b1","Type":"ContainerDied","Data":"c64b812317608a737be389036c63cf4085afb42eb9cfe12435214a1851ff0d05"} Dec 02 17:15:04 crc kubenswrapper[4933]: I1202 17:15:04.988770 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c64b812317608a737be389036c63cf4085afb42eb9cfe12435214a1851ff0d05" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.027597 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.132964 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmszd\" (UniqueName: \"kubernetes.io/projected/9fb8219e-7137-4924-a65f-eddda075a4b1-kube-api-access-vmszd\") pod \"9fb8219e-7137-4924-a65f-eddda075a4b1\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.133382 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fb8219e-7137-4924-a65f-eddda075a4b1-config-volume\") pod \"9fb8219e-7137-4924-a65f-eddda075a4b1\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.133447 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fb8219e-7137-4924-a65f-eddda075a4b1-secret-volume\") pod \"9fb8219e-7137-4924-a65f-eddda075a4b1\" (UID: \"9fb8219e-7137-4924-a65f-eddda075a4b1\") " Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.134383 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fb8219e-7137-4924-a65f-eddda075a4b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "9fb8219e-7137-4924-a65f-eddda075a4b1" (UID: "9fb8219e-7137-4924-a65f-eddda075a4b1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.142015 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb8219e-7137-4924-a65f-eddda075a4b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9fb8219e-7137-4924-a65f-eddda075a4b1" (UID: "9fb8219e-7137-4924-a65f-eddda075a4b1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.142236 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb8219e-7137-4924-a65f-eddda075a4b1-kube-api-access-vmszd" (OuterVolumeSpecName: "kube-api-access-vmszd") pod "9fb8219e-7137-4924-a65f-eddda075a4b1" (UID: "9fb8219e-7137-4924-a65f-eddda075a4b1"). InnerVolumeSpecName "kube-api-access-vmszd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.236642 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmszd\" (UniqueName: \"kubernetes.io/projected/9fb8219e-7137-4924-a65f-eddda075a4b1-kube-api-access-vmszd\") on node \"crc\" DevicePath \"\"" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.236676 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fb8219e-7137-4924-a65f-eddda075a4b1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:15:05 crc kubenswrapper[4933]: I1202 17:15:05.236690 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9fb8219e-7137-4924-a65f-eddda075a4b1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:15:06 crc kubenswrapper[4933]: I1202 17:15:06.000057 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411595-nd6j7" Dec 02 17:15:06 crc kubenswrapper[4933]: I1202 17:15:06.119666 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5"] Dec 02 17:15:06 crc kubenswrapper[4933]: I1202 17:15:06.134269 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411550-2x6q5"] Dec 02 17:15:07 crc kubenswrapper[4933]: I1202 17:15:07.076305 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699448fc-4c20-4353-b844-e6beec90e5ac" path="/var/lib/kubelet/pods/699448fc-4c20-4353-b844-e6beec90e5ac/volumes" Dec 02 17:15:21 crc kubenswrapper[4933]: I1202 17:15:21.119814 4933 trace.go:236] Trace[1569056302]: "Calculate volume metrics of run-httpd for pod openstack/swift-proxy-64f6f8f6c-x5wnf" (02-Dec-2025 17:15:19.166) (total time: 1942ms): Dec 02 17:15:21 crc kubenswrapper[4933]: Trace[1569056302]: [1.942920907s] [1.942920907s] END Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.798853 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ccddh"] Dec 02 17:15:25 crc kubenswrapper[4933]: E1202 17:15:25.800968 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb8219e-7137-4924-a65f-eddda075a4b1" containerName="collect-profiles" Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.801066 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb8219e-7137-4924-a65f-eddda075a4b1" containerName="collect-profiles" Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.801380 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb8219e-7137-4924-a65f-eddda075a4b1" containerName="collect-profiles" Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.804224 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.855149 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ccddh"] Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.932909 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-utilities\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.933394 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj4ph\" (UniqueName: \"kubernetes.io/projected/cc357b83-3ecf-4ba1-b963-189d04f3492e-kube-api-access-dj4ph\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:25 crc kubenswrapper[4933]: I1202 17:15:25.933476 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-catalog-content\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.035462 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-utilities\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.035674 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj4ph\" (UniqueName: \"kubernetes.io/projected/cc357b83-3ecf-4ba1-b963-189d04f3492e-kube-api-access-dj4ph\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.035706 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-catalog-content\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.036266 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-utilities\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.036328 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-catalog-content\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.061995 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dj4ph\" (UniqueName: \"kubernetes.io/projected/cc357b83-3ecf-4ba1-b963-189d04f3492e-kube-api-access-dj4ph\") pod \"certified-operators-ccddh\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.144682 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:26 crc kubenswrapper[4933]: I1202 17:15:26.642994 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ccddh"] Dec 02 17:15:27 crc kubenswrapper[4933]: I1202 17:15:27.379093 4933 generic.go:334] "Generic (PLEG): container finished" podID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerID="cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e" exitCode=0 Dec 02 17:15:27 crc kubenswrapper[4933]: I1202 17:15:27.379456 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerDied","Data":"cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e"} Dec 02 17:15:27 crc kubenswrapper[4933]: I1202 17:15:27.379493 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerStarted","Data":"98c6d145a65d539e7362794898174c5032bf54974f8cfa26e027cb1b7592be4e"} Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.397997 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xmhzg"] Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.400759 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.404548 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-utilities\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.404925 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-catalog-content\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.404979 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgjp5\" (UniqueName: \"kubernetes.io/projected/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-kube-api-access-sgjp5\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.425619 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmhzg"] Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.575300 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-utilities\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.575677 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-catalog-content\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.575712 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgjp5\" (UniqueName: \"kubernetes.io/projected/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-kube-api-access-sgjp5\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.575957 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-utilities\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.576234 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-catalog-content\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.622244 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sgjp5\" (UniqueName: \"kubernetes.io/projected/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-kube-api-access-sgjp5\") pod \"redhat-operators-xmhzg\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:28 crc kubenswrapper[4933]: I1202 17:15:28.794886 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:29 crc kubenswrapper[4933]: I1202 17:15:29.327253 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmhzg"] Dec 02 17:15:29 crc kubenswrapper[4933]: I1202 17:15:29.416378 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerStarted","Data":"19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e"} Dec 02 17:15:29 crc kubenswrapper[4933]: I1202 17:15:29.419480 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerStarted","Data":"ace927109e3a40f5693c3653804b4124169239daf230a6918f2d8c5332746191"} Dec 02 17:15:30 crc kubenswrapper[4933]: I1202 17:15:30.435679 4933 generic.go:334] "Generic (PLEG): container finished" podID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerID="19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e" exitCode=0 Dec 02 17:15:30 crc kubenswrapper[4933]: I1202 17:15:30.436480 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerDied","Data":"19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e"} Dec 02 17:15:30 crc kubenswrapper[4933]: I1202 17:15:30.442370 4933 generic.go:334] "Generic (PLEG): container finished" podID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerID="1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad" exitCode=0 Dec 02 17:15:30 crc kubenswrapper[4933]: I1202 17:15:30.442462 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerDied","Data":"1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad"} Dec 02 17:15:32 crc kubenswrapper[4933]: I1202 17:15:32.472173 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerStarted","Data":"905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023"} Dec 02 17:15:32 crc kubenswrapper[4933]: I1202 17:15:32.474815 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerStarted","Data":"3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3"} Dec 02 17:15:32 crc kubenswrapper[4933]: I1202 17:15:32.506173 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ccddh" podStartSLOduration=3.619645483 podStartE2EDuration="7.506156334s" podCreationTimestamp="2025-12-02 17:15:25 +0000 UTC" firstStartedPulling="2025-12-02 17:15:27.382970057 +0000 UTC m=+4990.634196760" lastFinishedPulling="2025-12-02 17:15:31.269480898 +0000 UTC m=+4994.520707611" 
observedRunningTime="2025-12-02 17:15:32.497450693 +0000 UTC m=+4995.748677396" watchObservedRunningTime="2025-12-02 17:15:32.506156334 +0000 UTC m=+4995.757383037" Dec 02 17:15:35 crc kubenswrapper[4933]: I1202 17:15:35.511718 4933 generic.go:334] "Generic (PLEG): container finished" podID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerID="3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3" exitCode=0 Dec 02 17:15:35 crc kubenswrapper[4933]: I1202 17:15:35.512320 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerDied","Data":"3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3"} Dec 02 17:15:36 crc kubenswrapper[4933]: I1202 17:15:36.146401 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:36 crc kubenswrapper[4933]: I1202 17:15:36.146487 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:36 crc kubenswrapper[4933]: I1202 17:15:36.528049 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerStarted","Data":"28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e"} Dec 02 17:15:36 crc kubenswrapper[4933]: I1202 17:15:36.552800 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xmhzg" podStartSLOduration=3.063564675 podStartE2EDuration="8.552778651s" podCreationTimestamp="2025-12-02 17:15:28 +0000 UTC" firstStartedPulling="2025-12-02 17:15:30.445075688 +0000 UTC m=+4993.696302401" lastFinishedPulling="2025-12-02 17:15:35.934289674 +0000 UTC m=+4999.185516377" observedRunningTime="2025-12-02 17:15:36.544297546 +0000 UTC m=+4999.795524269" watchObservedRunningTime="2025-12-02 17:15:36.552778651 +0000 UTC m=+4999.804005354" Dec 02 17:15:37 crc kubenswrapper[4933]: I1202 17:15:37.204468 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ccddh" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="registry-server" probeResult="failure" output=< Dec 02 17:15:37 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:15:37 crc kubenswrapper[4933]: > Dec 02 17:15:38 crc kubenswrapper[4933]: I1202 17:15:38.796268 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:38 crc kubenswrapper[4933]: I1202 17:15:38.796581 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:39 crc kubenswrapper[4933]: I1202 17:15:39.852291 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xmhzg" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="registry-server" probeResult="failure" output=< Dec 02 17:15:39 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:15:39 crc kubenswrapper[4933]: > Dec 02 17:15:46 crc kubenswrapper[4933]: I1202 17:15:46.204587 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:46 crc kubenswrapper[4933]: I1202 17:15:46.276364 4933 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:46 crc kubenswrapper[4933]: I1202 17:15:46.449526 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ccddh"] Dec 02 17:15:47 crc kubenswrapper[4933]: I1202 17:15:47.636844 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ccddh" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="registry-server" containerID="cri-o://905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023" gracePeriod=2 Dec 02 17:15:47 crc kubenswrapper[4933]: I1202 17:15:47.960832 4933 scope.go:117] "RemoveContainer" containerID="8bbbe127965e9e18276dab3fc4eafefa87a3b997f5ce8c03b3ec5d398be43c05" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.403926 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.490924 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-catalog-content\") pod \"cc357b83-3ecf-4ba1-b963-189d04f3492e\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.491296 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj4ph\" (UniqueName: \"kubernetes.io/projected/cc357b83-3ecf-4ba1-b963-189d04f3492e-kube-api-access-dj4ph\") pod \"cc357b83-3ecf-4ba1-b963-189d04f3492e\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.491422 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-utilities\") pod \"cc357b83-3ecf-4ba1-b963-189d04f3492e\" (UID: \"cc357b83-3ecf-4ba1-b963-189d04f3492e\") " Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.494397 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-utilities" (OuterVolumeSpecName: "utilities") pod "cc357b83-3ecf-4ba1-b963-189d04f3492e" (UID: "cc357b83-3ecf-4ba1-b963-189d04f3492e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.548037 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc357b83-3ecf-4ba1-b963-189d04f3492e-kube-api-access-dj4ph" (OuterVolumeSpecName: "kube-api-access-dj4ph") pod "cc357b83-3ecf-4ba1-b963-189d04f3492e" (UID: "cc357b83-3ecf-4ba1-b963-189d04f3492e"). InnerVolumeSpecName "kube-api-access-dj4ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.594530 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj4ph\" (UniqueName: \"kubernetes.io/projected/cc357b83-3ecf-4ba1-b963-189d04f3492e-kube-api-access-dj4ph\") on node \"crc\" DevicePath \"\"" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.594895 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.596996 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc357b83-3ecf-4ba1-b963-189d04f3492e" (UID: "cc357b83-3ecf-4ba1-b963-189d04f3492e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.654457 4933 generic.go:334] "Generic (PLEG): container finished" podID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerID="905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023" exitCode=0 Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.654507 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerDied","Data":"905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023"} Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.654607 4933 scope.go:117] "RemoveContainer" containerID="905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.654654 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ccddh" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.654536 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ccddh" event={"ID":"cc357b83-3ecf-4ba1-b963-189d04f3492e","Type":"ContainerDied","Data":"98c6d145a65d539e7362794898174c5032bf54974f8cfa26e027cb1b7592be4e"} Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.698316 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc357b83-3ecf-4ba1-b963-189d04f3492e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.702690 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ccddh"] Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.711030 4933 scope.go:117] "RemoveContainer" containerID="19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.718025 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ccddh"] Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.740449 4933 scope.go:117] "RemoveContainer" containerID="cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.767593 4933 scope.go:117] "RemoveContainer" containerID="905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023" Dec 02 17:15:48 crc kubenswrapper[4933]: E1202 17:15:48.769077 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023\": container with ID starting with 905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023 not found: ID does not exist" containerID="905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.769127 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023"} err="failed to get container status \"905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023\": rpc error: code = NotFound desc = could not find container \"905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023\": container with ID starting with 905ecc3bd781bd26a3df58287eee49585cbbb0593d238950d1071f5379665023 not found: ID does not exist" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.769158 4933 scope.go:117] "RemoveContainer" containerID="19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e" Dec 02 17:15:48 crc kubenswrapper[4933]: E1202 17:15:48.769798 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e\": container with ID starting with 19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e not found: ID does not exist" containerID="19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.769887 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e"} err="failed to get container status 
\"19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e\": rpc error: code = NotFound desc = could not find container \"19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e\": container with ID starting with 19768edab076465f4ad02dca550625c2c1f6891c6c0db4a581e57ddc8755a83e not found: ID does not exist" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.769916 4933 scope.go:117] "RemoveContainer" containerID="cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e" Dec 02 17:15:48 crc kubenswrapper[4933]: E1202 17:15:48.770502 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e\": container with ID starting with cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e not found: ID does not exist" containerID="cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e" Dec 02 17:15:48 crc kubenswrapper[4933]: I1202 17:15:48.770531 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e"} err="failed to get container status \"cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e\": rpc error: code = NotFound desc = could not find container \"cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e\": container with ID starting with cefff25c5fa077d8c2f1e4ec43bcb2e00f62c32583ae6b53a1c566d13c57c90e not found: ID does not exist" Dec 02 17:15:49 crc kubenswrapper[4933]: I1202 17:15:49.070692 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" path="/var/lib/kubelet/pods/cc357b83-3ecf-4ba1-b963-189d04f3492e/volumes" Dec 02 17:15:49 crc kubenswrapper[4933]: I1202 17:15:49.849664 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xmhzg" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="registry-server" probeResult="failure" output=< Dec 02 17:15:49 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:15:49 crc kubenswrapper[4933]: > Dec 02 17:15:58 crc kubenswrapper[4933]: I1202 17:15:58.854076 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:58 crc kubenswrapper[4933]: I1202 17:15:58.917417 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:15:59 crc kubenswrapper[4933]: I1202 17:15:59.601998 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmhzg"] Dec 02 17:16:00 crc kubenswrapper[4933]: I1202 17:16:00.775392 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xmhzg" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="registry-server" containerID="cri-o://28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e" gracePeriod=2 Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.428392 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.518787 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-catalog-content\") pod \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.518898 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-utilities\") pod \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.519173 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgjp5\" (UniqueName: \"kubernetes.io/projected/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-kube-api-access-sgjp5\") pod \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\" (UID: \"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f\") " Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.519566 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-utilities" (OuterVolumeSpecName: "utilities") pod "93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" (UID: "93dff2c3-803d-4162-8ce4-3cf7c2f66c1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.520203 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.526220 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-kube-api-access-sgjp5" (OuterVolumeSpecName: "kube-api-access-sgjp5") pod "93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" (UID: "93dff2c3-803d-4162-8ce4-3cf7c2f66c1f"). InnerVolumeSpecName "kube-api-access-sgjp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.621955 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgjp5\" (UniqueName: \"kubernetes.io/projected/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-kube-api-access-sgjp5\") on node \"crc\" DevicePath \"\"" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.636885 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" (UID: "93dff2c3-803d-4162-8ce4-3cf7c2f66c1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.724774 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.788682 4933 generic.go:334] "Generic (PLEG): container finished" podID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerID="28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e" exitCode=0 Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.788735 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmhzg" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.788734 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerDied","Data":"28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e"} Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.788971 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmhzg" event={"ID":"93dff2c3-803d-4162-8ce4-3cf7c2f66c1f","Type":"ContainerDied","Data":"ace927109e3a40f5693c3653804b4124169239daf230a6918f2d8c5332746191"} Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.789058 4933 scope.go:117] "RemoveContainer" containerID="28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.824785 4933 scope.go:117] "RemoveContainer" containerID="3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.828178 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmhzg"] Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.839719 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xmhzg"] Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.845021 4933 scope.go:117] "RemoveContainer" containerID="1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.905642 4933 scope.go:117] "RemoveContainer" containerID="28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e" Dec 02 17:16:01 crc kubenswrapper[4933]: E1202 17:16:01.906452 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e\": container with ID starting with 28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e not found: ID does not exist" containerID="28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.906498 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e"} err="failed to get container status \"28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e\": rpc error: code = NotFound desc = could not find container \"28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e\": container with ID starting with 28ea49a650b8a139be1dafce8ab1086555bd583531947335f881cd2e126ff81e not found: ID does not exist" Dec 02 17:16:01 crc 
kubenswrapper[4933]: I1202 17:16:01.906525 4933 scope.go:117] "RemoveContainer" containerID="3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3" Dec 02 17:16:01 crc kubenswrapper[4933]: E1202 17:16:01.906970 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3\": container with ID starting with 3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3 not found: ID does not exist" containerID="3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.907025 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3"} err="failed to get container status \"3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3\": rpc error: code = NotFound desc = could not find container \"3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3\": container with ID starting with 3473a9589ea90ffddf56b95efcfce3412616b3347d4ac6110840122ddc2598e3 not found: ID does not exist" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.907072 4933 scope.go:117] "RemoveContainer" containerID="1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad" Dec 02 17:16:01 crc kubenswrapper[4933]: E1202 17:16:01.907412 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad\": container with ID starting with 1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad not found: ID does not exist" containerID="1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad" Dec 02 17:16:01 crc kubenswrapper[4933]: I1202 17:16:01.907444 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad"} err="failed to get container status \"1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad\": rpc error: code = NotFound desc = could not find container \"1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad\": container with ID starting with 1e3e4141b68db4462b001b00f6a9c78e62f51b31085637a069152ba92f1e2bad not found: ID does not exist" Dec 02 17:16:03 crc kubenswrapper[4933]: I1202 17:16:03.068405 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" path="/var/lib/kubelet/pods/93dff2c3-803d-4162-8ce4-3cf7c2f66c1f/volumes" Dec 02 17:16:17 crc kubenswrapper[4933]: I1202 17:16:17.169479 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:16:17 crc kubenswrapper[4933]: I1202 17:16:17.388096 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:16:47 crc kubenswrapper[4933]: I1202 17:16:47.169708 4933 patch_prober.go:28] interesting 
pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:16:47 crc kubenswrapper[4933]: I1202 17:16:47.170273 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:17:17 crc kubenswrapper[4933]: I1202 17:17:17.169126 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:17:17 crc kubenswrapper[4933]: I1202 17:17:17.169626 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:17:17 crc kubenswrapper[4933]: I1202 17:17:17.169667 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:17:17 crc kubenswrapper[4933]: I1202 17:17:17.170787 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 17:17:17 crc kubenswrapper[4933]: I1202 17:17:17.170859 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" gracePeriod=600 Dec 02 17:17:17 crc kubenswrapper[4933]: E1202 17:17:17.322989 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:17:18 crc kubenswrapper[4933]: I1202 17:17:18.119412 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" exitCode=0 Dec 02 17:17:18 crc kubenswrapper[4933]: I1202 17:17:18.119477 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"} Dec 02 17:17:18 crc kubenswrapper[4933]: 
I1202 17:17:18.119526 4933 scope.go:117] "RemoveContainer" containerID="c3b07b288ed93f4f920f4fab70c864505c565fe378f7ea5e73804f6086777bc6" Dec 02 17:17:18 crc kubenswrapper[4933]: I1202 17:17:18.121044 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:17:18 crc kubenswrapper[4933]: E1202 17:17:18.121711 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:17:33 crc kubenswrapper[4933]: I1202 17:17:33.056332 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:17:33 crc kubenswrapper[4933]: E1202 17:17:33.057244 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:17:45 crc kubenswrapper[4933]: I1202 17:17:45.054453 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:17:45 crc kubenswrapper[4933]: E1202 17:17:45.055343 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:17:57 crc kubenswrapper[4933]: I1202 17:17:57.054090 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:17:57 crc kubenswrapper[4933]: E1202 17:17:57.060086 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:18:10 crc kubenswrapper[4933]: I1202 17:18:10.053262 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:18:10 crc kubenswrapper[4933]: E1202 17:18:10.054942 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:18:23 crc kubenswrapper[4933]: I1202 
17:18:23.054703 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:18:23 crc kubenswrapper[4933]: E1202 17:18:23.058018 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:18:34 crc kubenswrapper[4933]: I1202 17:18:34.055487 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:18:34 crc kubenswrapper[4933]: E1202 17:18:34.056610 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:18:47 crc kubenswrapper[4933]: I1202 17:18:47.061220 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:18:47 crc kubenswrapper[4933]: E1202 17:18:47.062520 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:18:59 crc kubenswrapper[4933]: I1202 17:18:59.053353 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:18:59 crc kubenswrapper[4933]: E1202 17:18:59.054019 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:19:12 crc kubenswrapper[4933]: I1202 17:19:12.053672 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:19:12 crc kubenswrapper[4933]: E1202 17:19:12.054527 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:19:26 crc kubenswrapper[4933]: I1202 17:19:26.054209 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:19:26 crc kubenswrapper[4933]: E1202 17:19:26.054981 
4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:19:38 crc kubenswrapper[4933]: I1202 17:19:38.054208 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:19:38 crc kubenswrapper[4933]: E1202 17:19:38.055199 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:19:50 crc kubenswrapper[4933]: I1202 17:19:50.053803 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:19:50 crc kubenswrapper[4933]: E1202 17:19:50.055530 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:20:01 crc kubenswrapper[4933]: I1202 17:20:01.053157 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:20:01 crc kubenswrapper[4933]: E1202 17:20:01.054005 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:20:13 crc kubenswrapper[4933]: I1202 17:20:13.053702 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:20:13 crc kubenswrapper[4933]: E1202 17:20:13.054633 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:20:26 crc kubenswrapper[4933]: I1202 17:20:26.053303 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:20:26 crc kubenswrapper[4933]: E1202 17:20:26.054350 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:20:41 crc kubenswrapper[4933]: I1202 17:20:41.054204 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:20:41 crc kubenswrapper[4933]: E1202 17:20:41.055135 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:20:54 crc kubenswrapper[4933]: I1202 17:20:54.054274 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:20:54 crc kubenswrapper[4933]: E1202 17:20:54.056325 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:21:08 crc kubenswrapper[4933]: I1202 17:21:08.054245 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:21:08 crc kubenswrapper[4933]: E1202 17:21:08.056232 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:21:22 crc kubenswrapper[4933]: I1202 17:21:22.053852 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:21:22 crc kubenswrapper[4933]: E1202 17:21:22.054605 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:21:33 crc kubenswrapper[4933]: I1202 17:21:33.053889 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:21:33 crc kubenswrapper[4933]: E1202 17:21:33.055660 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.996730 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbssq"] Dec 02 17:21:34 crc kubenswrapper[4933]: E1202 17:21:34.999795 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="extract-content" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.999841 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="extract-content" Dec 02 17:21:34 crc kubenswrapper[4933]: E1202 17:21:34.999862 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="registry-server" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.999869 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="registry-server" Dec 02 17:21:34 crc kubenswrapper[4933]: E1202 17:21:34.999885 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="extract-utilities" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.999891 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="extract-utilities" Dec 02 17:21:34 crc kubenswrapper[4933]: E1202 17:21:34.999906 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="extract-utilities" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.999913 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="extract-utilities" Dec 02 17:21:34 crc kubenswrapper[4933]: E1202 17:21:34.999931 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="registry-server" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.999937 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="registry-server" Dec 02 17:21:34 crc kubenswrapper[4933]: E1202 17:21:34.999951 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="extract-content" Dec 02 17:21:34 crc kubenswrapper[4933]: I1202 17:21:34.999957 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="extract-content" Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.000471 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="93dff2c3-803d-4162-8ce4-3cf7c2f66c1f" containerName="registry-server" Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.000499 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc357b83-3ecf-4ba1-b963-189d04f3492e" containerName="registry-server" Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.002469 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.016247 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbssq"]
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.102098 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-utilities\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.102420 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntfh\" (UniqueName: \"kubernetes.io/projected/73954862-c62b-4ac9-bc96-67eb462b7dfa-kube-api-access-wntfh\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.102677 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-catalog-content\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.204657 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-utilities\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.205199 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wntfh\" (UniqueName: \"kubernetes.io/projected/73954862-c62b-4ac9-bc96-67eb462b7dfa-kube-api-access-wntfh\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.205240 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-utilities\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.205412 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-catalog-content\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.206644 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-catalog-content\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.481674 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntfh\" (UniqueName: \"kubernetes.io/projected/73954862-c62b-4ac9-bc96-67eb462b7dfa-kube-api-access-wntfh\") pod \"community-operators-dbssq\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") " pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:35 crc kubenswrapper[4933]: I1202 17:21:35.624362 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.141584 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbssq"]
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.321416 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbssq" event={"ID":"73954862-c62b-4ac9-bc96-67eb462b7dfa","Type":"ContainerStarted","Data":"dfe3ec379e663d534e223f6b230e41aaf53a7ec6ff68fe995245184a176ba080"}
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.795725 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mqrbh"]
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.798875 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.819582 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqrbh"]
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.948531 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-utilities\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.948603 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxp2p\" (UniqueName: \"kubernetes.io/projected/8b6dd372-0197-4809-a91e-9178b7aa3706-kube-api-access-dxp2p\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:36 crc kubenswrapper[4933]: I1202 17:21:36.948816 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-catalog-content\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.051052 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-utilities\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.051116 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxp2p\" (UniqueName: \"kubernetes.io/projected/8b6dd372-0197-4809-a91e-9178b7aa3706-kube-api-access-dxp2p\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh"
pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.051179 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-catalog-content\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.051710 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-utilities\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.051744 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-catalog-content\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.332986 4933 generic.go:334] "Generic (PLEG): container finished" podID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerID="317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801" exitCode=0 Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.333025 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbssq" event={"ID":"73954862-c62b-4ac9-bc96-67eb462b7dfa","Type":"ContainerDied","Data":"317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801"} Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.336794 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.596148 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxp2p\" (UniqueName: \"kubernetes.io/projected/8b6dd372-0197-4809-a91e-9178b7aa3706-kube-api-access-dxp2p\") pod \"redhat-marketplace-mqrbh\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") " pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:37 crc kubenswrapper[4933]: I1202 17:21:37.720491 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:38 crc kubenswrapper[4933]: I1202 17:21:38.201127 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqrbh"] Dec 02 17:21:38 crc kubenswrapper[4933]: W1202 17:21:38.206738 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b6dd372_0197_4809_a91e_9178b7aa3706.slice/crio-89fd2390a57ab935d95d614d105ba715ad1fa097402b152788eef2048a58e339 WatchSource:0}: Error finding container 89fd2390a57ab935d95d614d105ba715ad1fa097402b152788eef2048a58e339: Status 404 returned error can't find the container with id 89fd2390a57ab935d95d614d105ba715ad1fa097402b152788eef2048a58e339 Dec 02 17:21:38 crc kubenswrapper[4933]: I1202 17:21:38.345323 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerStarted","Data":"89fd2390a57ab935d95d614d105ba715ad1fa097402b152788eef2048a58e339"} Dec 02 17:21:39 crc kubenswrapper[4933]: I1202 17:21:39.358591 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbssq" event={"ID":"73954862-c62b-4ac9-bc96-67eb462b7dfa","Type":"ContainerStarted","Data":"2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508"} Dec 02 17:21:39 crc kubenswrapper[4933]: I1202 17:21:39.368078 4933 generic.go:334] "Generic (PLEG): container finished" podID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerID="6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657" exitCode=0 Dec 02 17:21:39 crc kubenswrapper[4933]: I1202 17:21:39.368125 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerDied","Data":"6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657"} Dec 02 17:21:40 crc kubenswrapper[4933]: I1202 17:21:40.381878 4933 generic.go:334] "Generic (PLEG): container finished" podID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerID="2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508" exitCode=0 Dec 02 17:21:40 crc kubenswrapper[4933]: I1202 17:21:40.381988 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbssq" event={"ID":"73954862-c62b-4ac9-bc96-67eb462b7dfa","Type":"ContainerDied","Data":"2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508"} Dec 02 17:21:40 crc kubenswrapper[4933]: I1202 17:21:40.386067 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerStarted","Data":"d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d"} Dec 02 17:21:41 crc kubenswrapper[4933]: I1202 17:21:41.398139 4933 generic.go:334] "Generic (PLEG): container finished" podID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerID="d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d" exitCode=0 Dec 02 17:21:41 crc kubenswrapper[4933]: I1202 17:21:41.398272 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerDied","Data":"d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d"} Dec 02 17:21:41 crc kubenswrapper[4933]: I1202 17:21:41.401605 
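[editor's note] The ContainerStarted/ContainerDied lines above come from PLEG, the pod lifecycle event generator: the kubelet relists container state, diffs it against the last relist, and feeds each transition into the sync loop. A trimmed-down dispatcher in the same spirit (the real event type is kubelet's pleg.PodLifecycleEvent; this is only an illustration):

package main

import "fmt"

// A toy analogue of the PLEG events in the log.
type plegEvent struct {
	PodID string
	Type  string // "ContainerStarted" | "ContainerDied"
	Data  string // container or sandbox ID
}

func handle(ev plegEvent) {
	switch ev.Type {
	case "ContainerStarted":
		fmt.Printf("pod %s: container %s started, trigger pod sync\n", ev.PodID, ev.Data)
	case "ContainerDied":
		fmt.Printf("pod %s: container %s died, record exit and sync\n", ev.PodID, ev.Data)
	}
}

func main() {
	// Mirrors the init-container chain seen for community-operators-dbssq:
	// extract-utilities dies with exit 0, then extract-content starts.
	handle(plegEvent{"dbssq", "ContainerDied", "317a77f6ada3"})
	handle(plegEvent{"dbssq", "ContainerStarted", "2ec88daecb72"})
}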
Dec 02 17:21:41 crc kubenswrapper[4933]: I1202 17:21:41.441243 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbssq" podStartSLOduration=3.80227206 podStartE2EDuration="7.440881709s" podCreationTimestamp="2025-12-02 17:21:34 +0000 UTC" firstStartedPulling="2025-12-02 17:21:37.335428442 +0000 UTC m=+5360.586655145" lastFinishedPulling="2025-12-02 17:21:40.974038091 +0000 UTC m=+5364.225264794" observedRunningTime="2025-12-02 17:21:41.431432268 +0000 UTC m=+5364.682658971" watchObservedRunningTime="2025-12-02 17:21:41.440881709 +0000 UTC m=+5364.692108412"
Dec 02 17:21:42 crc kubenswrapper[4933]: I1202 17:21:42.414888 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerStarted","Data":"e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f"}
Dec 02 17:21:42 crc kubenswrapper[4933]: I1202 17:21:42.462455 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mqrbh" podStartSLOduration=4.062498293 podStartE2EDuration="6.462438226s" podCreationTimestamp="2025-12-02 17:21:36 +0000 UTC" firstStartedPulling="2025-12-02 17:21:39.432718027 +0000 UTC m=+5362.683944750" lastFinishedPulling="2025-12-02 17:21:41.83265798 +0000 UTC m=+5365.083884683" observedRunningTime="2025-12-02 17:21:42.45919141 +0000 UTC m=+5365.710418133" watchObservedRunningTime="2025-12-02 17:21:42.462438226 +0000 UTC m=+5365.713664929"
Dec 02 17:21:45 crc kubenswrapper[4933]: I1202 17:21:45.625532 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:45 crc kubenswrapper[4933]: I1202 17:21:45.626258 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:46 crc kubenswrapper[4933]: I1202 17:21:46.054438 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"
Dec 02 17:21:46 crc kubenswrapper[4933]: E1202 17:21:46.055188 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:21:46 crc kubenswrapper[4933]: I1202 17:21:46.677738 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dbssq" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="registry-server" probeResult="failure" output=<
Dec 02 17:21:46 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s
Dec 02 17:21:46 crc kubenswrapper[4933]: >
Dec 02 17:21:47 crc kubenswrapper[4933]: I1202 17:21:47.721128 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:47 crc kubenswrapper[4933]: I1202 17:21:47.721447 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mqrbh"
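[editor's note] The startup-latency numbers above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check in Go using the timestamps from the community-operators-dbssq entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// The ".999999999" fraction is optional in Go layouts, so this
		// parses both the creation stamp and the nanosecond stamps.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-02 17:21:34 +0000 UTC")
	firstPull := parse("2025-12-02 17:21:37.335428442 +0000 UTC")
	lastPull := parse("2025-12-02 17:21:40.974038091 +0000 UTC")
	running := parse("2025-12-02 17:21:41.440881709 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 7.440881709s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: 3.80227206s
	fmt.Println(e2e, slo)
}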
Dec 02 17:21:47 crc kubenswrapper[4933]: I1202 17:21:47.799085 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:48 crc kubenswrapper[4933]: I1202 17:21:48.535230 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:49 crc kubenswrapper[4933]: I1202 17:21:49.603881 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqrbh"]
Dec 02 17:21:50 crc kubenswrapper[4933]: I1202 17:21:50.502109 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mqrbh" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="registry-server" containerID="cri-o://e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f" gracePeriod=2
Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.186996 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqrbh"
Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.234349 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-catalog-content\") pod \"8b6dd372-0197-4809-a91e-9178b7aa3706\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") "
Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.234563 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxp2p\" (UniqueName: \"kubernetes.io/projected/8b6dd372-0197-4809-a91e-9178b7aa3706-kube-api-access-dxp2p\") pod \"8b6dd372-0197-4809-a91e-9178b7aa3706\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") "
Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.234660 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-utilities\") pod \"8b6dd372-0197-4809-a91e-9178b7aa3706\" (UID: \"8b6dd372-0197-4809-a91e-9178b7aa3706\") "
Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.235381 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-utilities" (OuterVolumeSpecName: "utilities") pod "8b6dd372-0197-4809-a91e-9178b7aa3706" (UID: "8b6dd372-0197-4809-a91e-9178b7aa3706"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.243317 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6dd372-0197-4809-a91e-9178b7aa3706-kube-api-access-dxp2p" (OuterVolumeSpecName: "kube-api-access-dxp2p") pod "8b6dd372-0197-4809-a91e-9178b7aa3706" (UID: "8b6dd372-0197-4809-a91e-9178b7aa3706"). InnerVolumeSpecName "kube-api-access-dxp2p". PluginName "kubernetes.io/projected", VolumeGidValue ""
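[editor's note] gracePeriod=2 on the "Killing container with a grace period" line is the CRI stop contract: the runtime sends SIGTERM first, then SIGKILL once the grace period lapses. A self-contained sketch of that sequence against an ordinary process (illustrative only; CRI-O applies this to the container's init process):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace mimics the contract behind "Killing container with a
// grace period": SIGTERM first, SIGKILL once the grace period expires.
func stopWithGrace(p *os.Process, grace time.Duration) {
	_ = p.Signal(syscall.SIGTERM)
	done := make(chan struct{})
	go func() { p.Wait(); close(done) }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = p.Kill() // SIGKILL, like the runtime after gracePeriod=2
		fmt.Println("grace period expired, killed")
	}
}

func main() {
	cmd := exec.Command("sleep", "30") // stand-in for registry-server
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	stopWithGrace(cmd.Process, 2*time.Second)
}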
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.259295 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b6dd372-0197-4809-a91e-9178b7aa3706" (UID: "8b6dd372-0197-4809-a91e-9178b7aa3706"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.337433 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxp2p\" (UniqueName: \"kubernetes.io/projected/8b6dd372-0197-4809-a91e-9178b7aa3706-kube-api-access-dxp2p\") on node \"crc\" DevicePath \"\"" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.337485 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.337494 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b6dd372-0197-4809-a91e-9178b7aa3706-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.516666 4933 generic.go:334] "Generic (PLEG): container finished" podID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerID="e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f" exitCode=0 Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.516710 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerDied","Data":"e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f"} Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.516737 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqrbh" event={"ID":"8b6dd372-0197-4809-a91e-9178b7aa3706","Type":"ContainerDied","Data":"89fd2390a57ab935d95d614d105ba715ad1fa097402b152788eef2048a58e339"} Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.516753 4933 scope.go:117] "RemoveContainer" containerID="e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.516993 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqrbh" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.544429 4933 scope.go:117] "RemoveContainer" containerID="d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.572543 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqrbh"] Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.588542 4933 scope.go:117] "RemoveContainer" containerID="6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.589557 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqrbh"] Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.654779 4933 scope.go:117] "RemoveContainer" containerID="e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f" Dec 02 17:21:51 crc kubenswrapper[4933]: E1202 17:21:51.655379 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f\": container with ID starting with e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f not found: ID does not exist" containerID="e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.655409 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f"} err="failed to get container status \"e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f\": rpc error: code = NotFound desc = could not find container \"e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f\": container with ID starting with e00628b5d1d5255f837d260457f23b5dd090ecff2041dc3b944bce87aaec828f not found: ID does not exist" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.655429 4933 scope.go:117] "RemoveContainer" containerID="d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d" Dec 02 17:21:51 crc kubenswrapper[4933]: E1202 17:21:51.655719 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d\": container with ID starting with d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d not found: ID does not exist" containerID="d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.655740 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d"} err="failed to get container status \"d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d\": rpc error: code = NotFound desc = could not find container \"d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d\": container with ID starting with d4b4e44577058ee82aaeaf12e3d7a012e7f4945ee6151115eb9b66a6252d2d6d not found: ID does not exist" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.655750 4933 scope.go:117] "RemoveContainer" containerID="6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657" Dec 02 17:21:51 crc kubenswrapper[4933]: E1202 17:21:51.656161 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657\": container with ID starting with 6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657 not found: ID does not exist" containerID="6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657" Dec 02 17:21:51 crc kubenswrapper[4933]: I1202 17:21:51.656181 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657"} err="failed to get container status \"6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657\": rpc error: code = NotFound desc = could not find container \"6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657\": container with ID starting with 6c14e4d971ffc6bf6ca60dc4dc4d4ab389859e3cdaf1fd89712ce28cffd58657 not found: ID does not exist" Dec 02 17:21:53 crc kubenswrapper[4933]: I1202 17:21:53.071631 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" path="/var/lib/kubelet/pods/8b6dd372-0197-4809-a91e-9178b7aa3706/volumes" Dec 02 17:21:55 crc kubenswrapper[4933]: I1202 17:21:55.679632 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbssq" Dec 02 17:21:55 crc kubenswrapper[4933]: I1202 17:21:55.756761 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbssq" Dec 02 17:21:55 crc kubenswrapper[4933]: I1202 17:21:55.935663 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbssq"] Dec 02 17:21:57 crc kubenswrapper[4933]: I1202 17:21:57.586097 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dbssq" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="registry-server" containerID="cri-o://4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8" gracePeriod=2 Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.174662 4933 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.228962 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-utilities\") pod \"73954862-c62b-4ac9-bc96-67eb462b7dfa\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") "
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.229296 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wntfh\" (UniqueName: \"kubernetes.io/projected/73954862-c62b-4ac9-bc96-67eb462b7dfa-kube-api-access-wntfh\") pod \"73954862-c62b-4ac9-bc96-67eb462b7dfa\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") "
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.229418 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-catalog-content\") pod \"73954862-c62b-4ac9-bc96-67eb462b7dfa\" (UID: \"73954862-c62b-4ac9-bc96-67eb462b7dfa\") "
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.231132 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-utilities" (OuterVolumeSpecName: "utilities") pod "73954862-c62b-4ac9-bc96-67eb462b7dfa" (UID: "73954862-c62b-4ac9-bc96-67eb462b7dfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.234948 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73954862-c62b-4ac9-bc96-67eb462b7dfa-kube-api-access-wntfh" (OuterVolumeSpecName: "kube-api-access-wntfh") pod "73954862-c62b-4ac9-bc96-67eb462b7dfa" (UID: "73954862-c62b-4ac9-bc96-67eb462b7dfa"). InnerVolumeSpecName "kube-api-access-wntfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.293725 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73954862-c62b-4ac9-bc96-67eb462b7dfa" (UID: "73954862-c62b-4ac9-bc96-67eb462b7dfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.334732 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.334768 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wntfh\" (UniqueName: \"kubernetes.io/projected/73954862-c62b-4ac9-bc96-67eb462b7dfa-kube-api-access-wntfh\") on node \"crc\" DevicePath \"\""
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.334779 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73954862-c62b-4ac9-bc96-67eb462b7dfa-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.618525 4933 generic.go:334] "Generic (PLEG): container finished" podID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerID="4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8" exitCode=0
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.618783 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbssq" event={"ID":"73954862-c62b-4ac9-bc96-67eb462b7dfa","Type":"ContainerDied","Data":"4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8"}
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.618814 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbssq" event={"ID":"73954862-c62b-4ac9-bc96-67eb462b7dfa","Type":"ContainerDied","Data":"dfe3ec379e663d534e223f6b230e41aaf53a7ec6ff68fe995245184a176ba080"}
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.618873 4933 scope.go:117] "RemoveContainer" containerID="4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.619084 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbssq"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.656255 4933 scope.go:117] "RemoveContainer" containerID="2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.668981 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbssq"]
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.681057 4933 scope.go:117] "RemoveContainer" containerID="317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.682070 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dbssq"]
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.747950 4933 scope.go:117] "RemoveContainer" containerID="4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8"
Dec 02 17:21:58 crc kubenswrapper[4933]: E1202 17:21:58.748566 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8\": container with ID starting with 4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8 not found: ID does not exist" containerID="4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.748615 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8"} err="failed to get container status \"4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8\": rpc error: code = NotFound desc = could not find container \"4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8\": container with ID starting with 4d0c579139833bfd93d541580d6db66a116eb63c4d2a3eafe544c737c39f54c8 not found: ID does not exist"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.748645 4933 scope.go:117] "RemoveContainer" containerID="2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508"
Dec 02 17:21:58 crc kubenswrapper[4933]: E1202 17:21:58.749445 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508\": container with ID starting with 2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508 not found: ID does not exist" containerID="2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.749500 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508"} err="failed to get container status \"2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508\": rpc error: code = NotFound desc = could not find container \"2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508\": container with ID starting with 2ec88daecb724dab60c80e32c7ae79fc1b7e208cfb1cf4a2e3136d008947b508 not found: ID does not exist"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.749537 4933 scope.go:117] "RemoveContainer" containerID="317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801"
Dec 02 17:21:58 crc kubenswrapper[4933]: E1202 17:21:58.749927 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801\": container with ID starting with 317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801 not found: ID does not exist" containerID="317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801"
Dec 02 17:21:58 crc kubenswrapper[4933]: I1202 17:21:58.749960 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801"} err="failed to get container status \"317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801\": rpc error: code = NotFound desc = could not find container \"317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801\": container with ID starting with 317a77f6ada31a14cb6c9164b5a474670fece27d89003708b0a2a5c114162801 not found: ID does not exist"
Dec 02 17:21:59 crc kubenswrapper[4933]: I1202 17:21:59.066306 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" path="/var/lib/kubelet/pods/73954862-c62b-4ac9-bc96-67eb462b7dfa/volumes"
Dec 02 17:22:01 crc kubenswrapper[4933]: I1202 17:22:01.054230 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"
Dec 02 17:22:01 crc kubenswrapper[4933]: E1202 17:22:01.055641 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:22:14 crc kubenswrapper[4933]: I1202 17:22:14.053198 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"
Dec 02 17:22:14 crc kubenswrapper[4933]: E1202 17:22:14.054040 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:22:28 crc kubenswrapper[4933]: I1202 17:22:28.053898 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9"
Dec 02 17:22:29 crc kubenswrapper[4933]: I1202 17:22:29.036729 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"529eb147c924b9a9956a3e94527da0e2fe1e1626f4460203a305a414e55c74a6"}
Dec 02 17:24:47 crc kubenswrapper[4933]: I1202 17:24:47.168810 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 17:24:47 crc kubenswrapper[4933]: I1202 17:24:47.169274 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:25:17 crc kubenswrapper[4933]: I1202 17:25:17.168859 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:25:17 crc kubenswrapper[4933]: I1202 17:25:17.169348 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.964679 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49z2s"] Dec 02 17:25:27 crc kubenswrapper[4933]: E1202 17:25:27.965948 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="registry-server" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.965967 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="registry-server" Dec 02 17:25:27 crc kubenswrapper[4933]: E1202 17:25:27.966001 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="extract-utilities" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966030 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="extract-utilities" Dec 02 17:25:27 crc kubenswrapper[4933]: E1202 17:25:27.966044 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="extract-content" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966052 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="extract-content" Dec 02 17:25:27 crc kubenswrapper[4933]: E1202 17:25:27.966074 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="extract-content" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966082 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="extract-content" Dec 02 17:25:27 crc kubenswrapper[4933]: E1202 17:25:27.966103 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="extract-utilities" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966110 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="extract-utilities" Dec 02 17:25:27 crc kubenswrapper[4933]: E1202 17:25:27.966135 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="registry-server" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966145 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" 
containerName="registry-server" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966454 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6dd372-0197-4809-a91e-9178b7aa3706" containerName="registry-server" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.966485 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="73954862-c62b-4ac9-bc96-67eb462b7dfa" containerName="registry-server" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.968442 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:27 crc kubenswrapper[4933]: I1202 17:25:27.980860 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49z2s"] Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.065528 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qnz\" (UniqueName: \"kubernetes.io/projected/31fdcfec-48fe-49e9-bbd8-ad36678feca0-kube-api-access-q9qnz\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.066094 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-utilities\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.066148 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-catalog-content\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.168456 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qnz\" (UniqueName: \"kubernetes.io/projected/31fdcfec-48fe-49e9-bbd8-ad36678feca0-kube-api-access-q9qnz\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.168658 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-utilities\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.168700 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-catalog-content\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.169404 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-utilities\") pod \"certified-operators-49z2s\" (UID: 
\"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.169485 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-catalog-content\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.194458 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qnz\" (UniqueName: \"kubernetes.io/projected/31fdcfec-48fe-49e9-bbd8-ad36678feca0-kube-api-access-q9qnz\") pod \"certified-operators-49z2s\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.302676 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:28 crc kubenswrapper[4933]: I1202 17:25:28.872990 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49z2s"] Dec 02 17:25:29 crc kubenswrapper[4933]: I1202 17:25:29.211459 4933 generic.go:334] "Generic (PLEG): container finished" podID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerID="6bd1eb4e7be98271e252880f57784a7bc701da1d8ba70e5aab0e83af3f76a443" exitCode=0 Dec 02 17:25:29 crc kubenswrapper[4933]: I1202 17:25:29.211570 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerDied","Data":"6bd1eb4e7be98271e252880f57784a7bc701da1d8ba70e5aab0e83af3f76a443"} Dec 02 17:25:29 crc kubenswrapper[4933]: I1202 17:25:29.211773 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerStarted","Data":"e7f07e4ca6c9e072fbceb347ca315d8e7106b9526ac75f3648f0b06e9f2cc516"} Dec 02 17:25:30 crc kubenswrapper[4933]: I1202 17:25:30.229028 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerStarted","Data":"a9e143b13d74fae0dd942df5ee6b9c3881c87a91d8f7db896a262c1fe7f32429"} Dec 02 17:25:32 crc kubenswrapper[4933]: I1202 17:25:32.247148 4933 generic.go:334] "Generic (PLEG): container finished" podID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerID="a9e143b13d74fae0dd942df5ee6b9c3881c87a91d8f7db896a262c1fe7f32429" exitCode=0 Dec 02 17:25:32 crc kubenswrapper[4933]: I1202 17:25:32.247305 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerDied","Data":"a9e143b13d74fae0dd942df5ee6b9c3881c87a91d8f7db896a262c1fe7f32429"} Dec 02 17:25:33 crc kubenswrapper[4933]: I1202 17:25:33.262016 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerStarted","Data":"88959d31318714c3eb2591451ada071d5715786672233462b95d85cfec97a181"} Dec 02 17:25:33 crc kubenswrapper[4933]: I1202 17:25:33.292962 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-49z2s" podStartSLOduration=2.800172421 podStartE2EDuration="6.29294443s" podCreationTimestamp="2025-12-02 17:25:27 +0000 UTC" firstStartedPulling="2025-12-02 17:25:29.213310968 +0000 UTC m=+5592.464537671" lastFinishedPulling="2025-12-02 17:25:32.706082967 +0000 UTC m=+5595.957309680" observedRunningTime="2025-12-02 17:25:33.285515122 +0000 UTC m=+5596.536741825" watchObservedRunningTime="2025-12-02 17:25:33.29294443 +0000 UTC m=+5596.544171133" Dec 02 17:25:38 crc kubenswrapper[4933]: I1202 17:25:38.302882 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:38 crc kubenswrapper[4933]: I1202 17:25:38.303501 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:38 crc kubenswrapper[4933]: I1202 17:25:38.376657 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:38 crc kubenswrapper[4933]: I1202 17:25:38.442152 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:38 crc kubenswrapper[4933]: I1202 17:25:38.623503 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49z2s"] Dec 02 17:25:40 crc kubenswrapper[4933]: I1202 17:25:40.354552 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49z2s" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="registry-server" containerID="cri-o://88959d31318714c3eb2591451ada071d5715786672233462b95d85cfec97a181" gracePeriod=2 Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.381000 4933 generic.go:334] "Generic (PLEG): container finished" podID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerID="88959d31318714c3eb2591451ada071d5715786672233462b95d85cfec97a181" exitCode=0 Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.381380 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerDied","Data":"88959d31318714c3eb2591451ada071d5715786672233462b95d85cfec97a181"} Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.677834 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.841112 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-utilities\") pod \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.841870 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-utilities" (OuterVolumeSpecName: "utilities") pod "31fdcfec-48fe-49e9-bbd8-ad36678feca0" (UID: "31fdcfec-48fe-49e9-bbd8-ad36678feca0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.842226 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qnz\" (UniqueName: \"kubernetes.io/projected/31fdcfec-48fe-49e9-bbd8-ad36678feca0-kube-api-access-q9qnz\") pod \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.842458 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-catalog-content\") pod \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\" (UID: \"31fdcfec-48fe-49e9-bbd8-ad36678feca0\") " Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.843485 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.848786 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fdcfec-48fe-49e9-bbd8-ad36678feca0-kube-api-access-q9qnz" (OuterVolumeSpecName: "kube-api-access-q9qnz") pod "31fdcfec-48fe-49e9-bbd8-ad36678feca0" (UID: "31fdcfec-48fe-49e9-bbd8-ad36678feca0"). InnerVolumeSpecName "kube-api-access-q9qnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.907621 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fdcfec-48fe-49e9-bbd8-ad36678feca0" (UID: "31fdcfec-48fe-49e9-bbd8-ad36678feca0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.945724 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qnz\" (UniqueName: \"kubernetes.io/projected/31fdcfec-48fe-49e9-bbd8-ad36678feca0-kube-api-access-q9qnz\") on node \"crc\" DevicePath \"\"" Dec 02 17:25:41 crc kubenswrapper[4933]: I1202 17:25:41.945774 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fdcfec-48fe-49e9-bbd8-ad36678feca0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.396474 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49z2s" event={"ID":"31fdcfec-48fe-49e9-bbd8-ad36678feca0","Type":"ContainerDied","Data":"e7f07e4ca6c9e072fbceb347ca315d8e7106b9526ac75f3648f0b06e9f2cc516"} Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.396593 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49z2s" Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.396623 4933 scope.go:117] "RemoveContainer" containerID="88959d31318714c3eb2591451ada071d5715786672233462b95d85cfec97a181" Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.430400 4933 scope.go:117] "RemoveContainer" containerID="a9e143b13d74fae0dd942df5ee6b9c3881c87a91d8f7db896a262c1fe7f32429" Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.451553 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49z2s"] Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.463757 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49z2s"] Dec 02 17:25:42 crc kubenswrapper[4933]: I1202 17:25:42.478530 4933 scope.go:117] "RemoveContainer" containerID="6bd1eb4e7be98271e252880f57784a7bc701da1d8ba70e5aab0e83af3f76a443" Dec 02 17:25:43 crc kubenswrapper[4933]: I1202 17:25:43.075990 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" path="/var/lib/kubelet/pods/31fdcfec-48fe-49e9-bbd8-ad36678feca0/volumes" Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.169085 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.169546 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.169594 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.170467 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"529eb147c924b9a9956a3e94527da0e2fe1e1626f4460203a305a414e55c74a6"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.170521 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://529eb147c924b9a9956a3e94527da0e2fe1e1626f4460203a305a414e55c74a6" gracePeriod=600 Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.451570 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="529eb147c924b9a9956a3e94527da0e2fe1e1626f4460203a305a414e55c74a6" exitCode=0 Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.451679 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" 
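[editor's note] The liveness failures at 17:24:47, 17:25:17, and 17:25:47 are 30 seconds apart, and after the repeated failures the kubelet logs "failed liveness probe, will be restarted" and kills the container with gracePeriod=600 (presumably the pod's termination grace period). An assumed probe shape consistent with those observations; FailureThreshold is a guess:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Assumed shape of machine-config-daemon's liveness probe: the log
	// shows GETs against 127.0.0.1:8798/health every 30 seconds and a
	// restart after repeated failures.
	liveness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Host: "127.0.0.1",
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		PeriodSeconds:    30,
		FailureThreshold: 3,
	}
	fmt.Printf("probe %s:%s%s every %ds\n",
		liveness.HTTPGet.Host, liveness.HTTPGet.Port.String(),
		liveness.HTTPGet.Path, liveness.PeriodSeconds)
}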
event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"529eb147c924b9a9956a3e94527da0e2fe1e1626f4460203a305a414e55c74a6"} Dec 02 17:25:47 crc kubenswrapper[4933]: I1202 17:25:47.451994 4933 scope.go:117] "RemoveContainer" containerID="c29bcc628a04bb3057206075ef5318ca3acce9aa63e363f1a269a85f5c7c79d9" Dec 02 17:25:48 crc kubenswrapper[4933]: I1202 17:25:48.468371 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17"} Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.592190 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bmdjq"] Dec 02 17:26:34 crc kubenswrapper[4933]: E1202 17:26:34.593241 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="extract-content" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.593254 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="extract-content" Dec 02 17:26:34 crc kubenswrapper[4933]: E1202 17:26:34.593267 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="registry-server" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.593273 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="registry-server" Dec 02 17:26:34 crc kubenswrapper[4933]: E1202 17:26:34.593285 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="extract-utilities" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.593291 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="extract-utilities" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.593529 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="31fdcfec-48fe-49e9-bbd8-ad36678feca0" containerName="registry-server" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.595350 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.630903 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bmdjq"] Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.708972 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsdsw\" (UniqueName: \"kubernetes.io/projected/35f800d2-6297-4151-b58f-582e4e791cec-kube-api-access-tsdsw\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.709345 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-catalog-content\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.709721 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-utilities\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.811936 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-catalog-content\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.812098 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-utilities\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.812179 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsdsw\" (UniqueName: \"kubernetes.io/projected/35f800d2-6297-4151-b58f-582e4e791cec-kube-api-access-tsdsw\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.812650 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-catalog-content\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.812699 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-utilities\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.830615 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tsdsw\" (UniqueName: \"kubernetes.io/projected/35f800d2-6297-4151-b58f-582e4e791cec-kube-api-access-tsdsw\") pod \"redhat-operators-bmdjq\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:34 crc kubenswrapper[4933]: I1202 17:26:34.922321 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:35 crc kubenswrapper[4933]: I1202 17:26:35.418850 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bmdjq"] Dec 02 17:26:36 crc kubenswrapper[4933]: I1202 17:26:36.087631 4933 generic.go:334] "Generic (PLEG): container finished" podID="35f800d2-6297-4151-b58f-582e4e791cec" containerID="948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba" exitCode=0 Dec 02 17:26:36 crc kubenswrapper[4933]: I1202 17:26:36.088798 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerDied","Data":"948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba"} Dec 02 17:26:36 crc kubenswrapper[4933]: I1202 17:26:36.088868 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerStarted","Data":"5db1cce3a360d63ac8914e4692ba0abbfce2c475686dcb9f0be240d739aa992d"} Dec 02 17:26:38 crc kubenswrapper[4933]: I1202 17:26:38.112918 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerStarted","Data":"ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811"} Dec 02 17:26:41 crc kubenswrapper[4933]: I1202 17:26:41.144928 4933 generic.go:334] "Generic (PLEG): container finished" podID="35f800d2-6297-4151-b58f-582e4e791cec" containerID="ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811" exitCode=0 Dec 02 17:26:41 crc kubenswrapper[4933]: I1202 17:26:41.144998 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerDied","Data":"ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811"} Dec 02 17:26:41 crc kubenswrapper[4933]: I1202 17:26:41.148212 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:26:42 crc kubenswrapper[4933]: I1202 17:26:42.166532 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerStarted","Data":"87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c"} Dec 02 17:26:42 crc kubenswrapper[4933]: I1202 17:26:42.199178 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bmdjq" podStartSLOduration=2.645539827 podStartE2EDuration="8.19915913s" podCreationTimestamp="2025-12-02 17:26:34 +0000 UTC" firstStartedPulling="2025-12-02 17:26:36.099096812 +0000 UTC m=+5659.350323515" lastFinishedPulling="2025-12-02 17:26:41.652716125 +0000 UTC m=+5664.903942818" observedRunningTime="2025-12-02 17:26:42.187239013 +0000 UTC m=+5665.438465716" watchObservedRunningTime="2025-12-02 17:26:42.19915913 +0000 UTC m=+5665.450385833" Dec 02 17:26:44 crc 
kubenswrapper[4933]: I1202 17:26:44.942314 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:44 crc kubenswrapper[4933]: I1202 17:26:44.942942 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:45 crc kubenswrapper[4933]: I1202 17:26:45.996465 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bmdjq" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="registry-server" probeResult="failure" output=< Dec 02 17:26:45 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:26:45 crc kubenswrapper[4933]: > Dec 02 17:26:55 crc kubenswrapper[4933]: I1202 17:26:55.036852 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:55 crc kubenswrapper[4933]: I1202 17:26:55.171394 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:55 crc kubenswrapper[4933]: I1202 17:26:55.296093 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bmdjq"] Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.327941 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bmdjq" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="registry-server" containerID="cri-o://87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c" gracePeriod=2 Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.820665 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.962881 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsdsw\" (UniqueName: \"kubernetes.io/projected/35f800d2-6297-4151-b58f-582e4e791cec-kube-api-access-tsdsw\") pod \"35f800d2-6297-4151-b58f-582e4e791cec\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.962949 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-utilities\") pod \"35f800d2-6297-4151-b58f-582e4e791cec\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.963057 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-catalog-content\") pod \"35f800d2-6297-4151-b58f-582e4e791cec\" (UID: \"35f800d2-6297-4151-b58f-582e4e791cec\") " Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.963930 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-utilities" (OuterVolumeSpecName: "utilities") pod "35f800d2-6297-4151-b58f-582e4e791cec" (UID: "35f800d2-6297-4151-b58f-582e4e791cec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.964744 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:26:56 crc kubenswrapper[4933]: I1202 17:26:56.971752 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f800d2-6297-4151-b58f-582e4e791cec-kube-api-access-tsdsw" (OuterVolumeSpecName: "kube-api-access-tsdsw") pod "35f800d2-6297-4151-b58f-582e4e791cec" (UID: "35f800d2-6297-4151-b58f-582e4e791cec"). InnerVolumeSpecName "kube-api-access-tsdsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.058302 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35f800d2-6297-4151-b58f-582e4e791cec" (UID: "35f800d2-6297-4151-b58f-582e4e791cec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.067378 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsdsw\" (UniqueName: \"kubernetes.io/projected/35f800d2-6297-4151-b58f-582e4e791cec-kube-api-access-tsdsw\") on node \"crc\" DevicePath \"\"" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.067431 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f800d2-6297-4151-b58f-582e4e791cec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.343598 4933 generic.go:334] "Generic (PLEG): container finished" podID="35f800d2-6297-4151-b58f-582e4e791cec" containerID="87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c" exitCode=0 Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.343651 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerDied","Data":"87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c"} Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.343693 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmdjq" event={"ID":"35f800d2-6297-4151-b58f-582e4e791cec","Type":"ContainerDied","Data":"5db1cce3a360d63ac8914e4692ba0abbfce2c475686dcb9f0be240d739aa992d"} Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.343692 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bmdjq" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.343712 4933 scope.go:117] "RemoveContainer" containerID="87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.372556 4933 scope.go:117] "RemoveContainer" containerID="ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.377490 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bmdjq"] Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.398625 4933 scope.go:117] "RemoveContainer" containerID="948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.401698 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bmdjq"] Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.463128 4933 scope.go:117] "RemoveContainer" containerID="87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c" Dec 02 17:26:57 crc kubenswrapper[4933]: E1202 17:26:57.463864 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c\": container with ID starting with 87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c not found: ID does not exist" containerID="87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.463929 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c"} err="failed to get container status \"87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c\": rpc error: code = NotFound desc = could not find container \"87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c\": container with ID starting with 87c59f983765a64aba90dba6d86e2df9cc248716eb5de597ed55a74a48e8241c not found: ID does not exist" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.463970 4933 scope.go:117] "RemoveContainer" containerID="ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811" Dec 02 17:26:57 crc kubenswrapper[4933]: E1202 17:26:57.464446 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811\": container with ID starting with ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811 not found: ID does not exist" containerID="ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.464481 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811"} err="failed to get container status \"ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811\": rpc error: code = NotFound desc = could not find container \"ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811\": container with ID starting with ca57374306dcc25f2241f676dd507116ac5e95e567d4641aa3eeb17f58588811 not found: ID does not exist" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.464502 4933 scope.go:117] "RemoveContainer" 
containerID="948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba" Dec 02 17:26:57 crc kubenswrapper[4933]: E1202 17:26:57.465550 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba\": container with ID starting with 948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba not found: ID does not exist" containerID="948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba" Dec 02 17:26:57 crc kubenswrapper[4933]: I1202 17:26:57.465619 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba"} err="failed to get container status \"948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba\": rpc error: code = NotFound desc = could not find container \"948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba\": container with ID starting with 948a8d00b44ea595f91a832b6f010738024854062772df868ead0288767337ba not found: ID does not exist" Dec 02 17:26:59 crc kubenswrapper[4933]: I1202 17:26:59.066246 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f800d2-6297-4151-b58f-582e4e791cec" path="/var/lib/kubelet/pods/35f800d2-6297-4151-b58f-582e4e791cec/volumes" Dec 02 17:27:47 crc kubenswrapper[4933]: I1202 17:27:47.170242 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:27:47 crc kubenswrapper[4933]: I1202 17:27:47.170744 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:28:17 crc kubenswrapper[4933]: I1202 17:28:17.169619 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:28:17 crc kubenswrapper[4933]: I1202 17:28:17.170269 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.169222 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.169842 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.169898 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.170998 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.171059 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" gracePeriod=600 Dec 02 17:28:47 crc kubenswrapper[4933]: E1202 17:28:47.308086 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.670415 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" exitCode=0 Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.670470 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17"} Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.670509 4933 scope.go:117] "RemoveContainer" containerID="529eb147c924b9a9956a3e94527da0e2fe1e1626f4460203a305a414e55c74a6" Dec 02 17:28:47 crc kubenswrapper[4933]: I1202 17:28:47.671645 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:28:47 crc kubenswrapper[4933]: E1202 17:28:47.672239 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:28:59 crc kubenswrapper[4933]: I1202 17:28:59.053990 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:28:59 crc kubenswrapper[4933]: E1202 17:28:59.054950 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:29:14 crc kubenswrapper[4933]: I1202 17:29:14.053124 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:29:14 crc kubenswrapper[4933]: E1202 17:29:14.053795 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:29:29 crc kubenswrapper[4933]: I1202 17:29:29.054193 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:29:29 crc kubenswrapper[4933]: E1202 17:29:29.055048 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:29:43 crc kubenswrapper[4933]: I1202 17:29:43.053280 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:29:43 crc kubenswrapper[4933]: E1202 17:29:43.054178 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:29:56 crc kubenswrapper[4933]: I1202 17:29:56.053595 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:29:56 crc kubenswrapper[4933]: E1202 17:29:56.054723 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.166939 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg"] Dec 02 17:30:00 crc kubenswrapper[4933]: E1202 17:30:00.167808 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="extract-content" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.167846 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="extract-content" Dec 02 17:30:00 crc kubenswrapper[4933]: E1202 
17:30:00.167890 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="registry-server" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.167898 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="registry-server" Dec 02 17:30:00 crc kubenswrapper[4933]: E1202 17:30:00.167953 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="extract-utilities" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.167963 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="extract-utilities" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.168274 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f800d2-6297-4151-b58f-582e4e791cec" containerName="registry-server" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.169284 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.173235 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.173473 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.192927 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg"] Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.285117 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c51796-cd06-49c3-b749-d20f6d22b1b1-config-volume\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.285297 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45c51796-cd06-49c3-b749-d20f6d22b1b1-secret-volume\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.285327 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mg2\" (UniqueName: \"kubernetes.io/projected/45c51796-cd06-49c3-b749-d20f6d22b1b1-kube-api-access-g9mg2\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.387907 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45c51796-cd06-49c3-b749-d20f6d22b1b1-secret-volume\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: 
I1202 17:30:00.387997 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mg2\" (UniqueName: \"kubernetes.io/projected/45c51796-cd06-49c3-b749-d20f6d22b1b1-kube-api-access-g9mg2\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.388181 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c51796-cd06-49c3-b749-d20f6d22b1b1-config-volume\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.389223 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c51796-cd06-49c3-b749-d20f6d22b1b1-config-volume\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.394539 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45c51796-cd06-49c3-b749-d20f6d22b1b1-secret-volume\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.404715 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mg2\" (UniqueName: \"kubernetes.io/projected/45c51796-cd06-49c3-b749-d20f6d22b1b1-kube-api-access-g9mg2\") pod \"collect-profiles-29411610-k6ndg\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:00 crc kubenswrapper[4933]: I1202 17:30:00.492305 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:01 crc kubenswrapper[4933]: W1202 17:30:01.028033 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c51796_cd06_49c3_b749_d20f6d22b1b1.slice/crio-18ff0ae96b69509460c9840fa1bee010c81f19b792eadbf6bbf0dcefcfb5415d WatchSource:0}: Error finding container 18ff0ae96b69509460c9840fa1bee010c81f19b792eadbf6bbf0dcefcfb5415d: Status 404 returned error can't find the container with id 18ff0ae96b69509460c9840fa1bee010c81f19b792eadbf6bbf0dcefcfb5415d Dec 02 17:30:01 crc kubenswrapper[4933]: I1202 17:30:01.036211 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg"] Dec 02 17:30:01 crc kubenswrapper[4933]: I1202 17:30:01.761719 4933 generic.go:334] "Generic (PLEG): container finished" podID="45c51796-cd06-49c3-b749-d20f6d22b1b1" containerID="5f0e297dfdcc1a2580db5d2baec4038ba22cedcd030483b3053a34277a5ecb0f" exitCode=0 Dec 02 17:30:01 crc kubenswrapper[4933]: I1202 17:30:01.761990 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" event={"ID":"45c51796-cd06-49c3-b749-d20f6d22b1b1","Type":"ContainerDied","Data":"5f0e297dfdcc1a2580db5d2baec4038ba22cedcd030483b3053a34277a5ecb0f"} Dec 02 17:30:01 crc kubenswrapper[4933]: I1202 17:30:01.762121 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" event={"ID":"45c51796-cd06-49c3-b749-d20f6d22b1b1","Type":"ContainerStarted","Data":"18ff0ae96b69509460c9840fa1bee010c81f19b792eadbf6bbf0dcefcfb5415d"} Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.211909 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.358057 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c51796-cd06-49c3-b749-d20f6d22b1b1-config-volume\") pod \"45c51796-cd06-49c3-b749-d20f6d22b1b1\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.358347 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45c51796-cd06-49c3-b749-d20f6d22b1b1-secret-volume\") pod \"45c51796-cd06-49c3-b749-d20f6d22b1b1\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.358467 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9mg2\" (UniqueName: \"kubernetes.io/projected/45c51796-cd06-49c3-b749-d20f6d22b1b1-kube-api-access-g9mg2\") pod \"45c51796-cd06-49c3-b749-d20f6d22b1b1\" (UID: \"45c51796-cd06-49c3-b749-d20f6d22b1b1\") " Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.359016 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c51796-cd06-49c3-b749-d20f6d22b1b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "45c51796-cd06-49c3-b749-d20f6d22b1b1" (UID: "45c51796-cd06-49c3-b749-d20f6d22b1b1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.364302 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c51796-cd06-49c3-b749-d20f6d22b1b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "45c51796-cd06-49c3-b749-d20f6d22b1b1" (UID: "45c51796-cd06-49c3-b749-d20f6d22b1b1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.364580 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c51796-cd06-49c3-b749-d20f6d22b1b1-kube-api-access-g9mg2" (OuterVolumeSpecName: "kube-api-access-g9mg2") pod "45c51796-cd06-49c3-b749-d20f6d22b1b1" (UID: "45c51796-cd06-49c3-b749-d20f6d22b1b1"). InnerVolumeSpecName "kube-api-access-g9mg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.462022 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45c51796-cd06-49c3-b749-d20f6d22b1b1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.462054 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9mg2\" (UniqueName: \"kubernetes.io/projected/45c51796-cd06-49c3-b749-d20f6d22b1b1-kube-api-access-g9mg2\") on node \"crc\" DevicePath \"\"" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.462063 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c51796-cd06-49c3-b749-d20f6d22b1b1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.788926 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" event={"ID":"45c51796-cd06-49c3-b749-d20f6d22b1b1","Type":"ContainerDied","Data":"18ff0ae96b69509460c9840fa1bee010c81f19b792eadbf6bbf0dcefcfb5415d"} Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.788966 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ff0ae96b69509460c9840fa1bee010c81f19b792eadbf6bbf0dcefcfb5415d" Dec 02 17:30:03 crc kubenswrapper[4933]: I1202 17:30:03.789003 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411610-k6ndg" Dec 02 17:30:04 crc kubenswrapper[4933]: I1202 17:30:04.363520 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6"] Dec 02 17:30:04 crc kubenswrapper[4933]: I1202 17:30:04.385225 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411565-vmzq6"] Dec 02 17:30:05 crc kubenswrapper[4933]: I1202 17:30:05.071989 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cff0b6-327c-4b71-b01b-ed77b98d24ea" path="/var/lib/kubelet/pods/27cff0b6-327c-4b71-b01b-ed77b98d24ea/volumes" Dec 02 17:30:08 crc kubenswrapper[4933]: I1202 17:30:08.053911 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:30:08 crc kubenswrapper[4933]: E1202 17:30:08.054783 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:30:21 crc kubenswrapper[4933]: I1202 17:30:21.054517 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:30:21 crc kubenswrapper[4933]: E1202 17:30:21.055452 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:30:33 crc kubenswrapper[4933]: I1202 17:30:33.054245 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:30:33 crc kubenswrapper[4933]: E1202 17:30:33.055087 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:30:48 crc kubenswrapper[4933]: I1202 17:30:48.053732 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:30:48 crc kubenswrapper[4933]: E1202 17:30:48.054857 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:30:48 crc kubenswrapper[4933]: I1202 17:30:48.736847 4933 scope.go:117] "RemoveContainer" 
containerID="55b6840cbc2de06f8d5f558141ab0bd9efd15fd02b53cefa1764c60107739e6c" Dec 02 17:30:59 crc kubenswrapper[4933]: I1202 17:30:59.053680 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:30:59 crc kubenswrapper[4933]: E1202 17:30:59.054801 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:31:11 crc kubenswrapper[4933]: I1202 17:31:11.053878 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:31:11 crc kubenswrapper[4933]: E1202 17:31:11.055002 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:31:26 crc kubenswrapper[4933]: I1202 17:31:26.054408 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:31:26 crc kubenswrapper[4933]: E1202 17:31:26.055524 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:31:38 crc kubenswrapper[4933]: I1202 17:31:38.054282 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:31:38 crc kubenswrapper[4933]: E1202 17:31:38.055330 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:31:53 crc kubenswrapper[4933]: I1202 17:31:53.055223 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:31:53 crc kubenswrapper[4933]: E1202 17:31:53.056139 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:32:04 crc kubenswrapper[4933]: I1202 17:32:04.053475 4933 scope.go:117] "RemoveContainer" 
containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:32:04 crc kubenswrapper[4933]: E1202 17:32:04.055650 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:32:13 crc kubenswrapper[4933]: I1202 17:32:13.312354 4933 generic.go:334] "Generic (PLEG): container finished" podID="bd0fb308-1858-4dea-bf49-38e577824bd0" containerID="1c80addd77e6b1b4f6b362ae8a129d8f952468d11ffcdf63469f001d777b88f8" exitCode=0 Dec 02 17:32:13 crc kubenswrapper[4933]: I1202 17:32:13.312464 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd0fb308-1858-4dea-bf49-38e577824bd0","Type":"ContainerDied","Data":"1c80addd77e6b1b4f6b362ae8a129d8f952468d11ffcdf63469f001d777b88f8"} Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.773947 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.871197 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.871664 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-config-data\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.871712 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config-secret\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.871766 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ca-certs\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.871863 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-workdir\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.872043 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crcnz\" (UniqueName: \"kubernetes.io/projected/bd0fb308-1858-4dea-bf49-38e577824bd0-kube-api-access-crcnz\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.872126 4933 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ssh-key\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.872165 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-temporary\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.872250 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config\") pod \"bd0fb308-1858-4dea-bf49-38e577824bd0\" (UID: \"bd0fb308-1858-4dea-bf49-38e577824bd0\") " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.872734 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.873163 4933 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.877461 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0fb308-1858-4dea-bf49-38e577824bd0-kube-api-access-crcnz" (OuterVolumeSpecName: "kube-api-access-crcnz") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "kube-api-access-crcnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.878007 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-config-data" (OuterVolumeSpecName: "config-data") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.878074 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.887059 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.908632 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.911410 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.913969 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.944958 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bd0fb308-1858-4dea-bf49-38e577824bd0" (UID: "bd0fb308-1858-4dea-bf49-38e577824bd0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976016 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976578 4933 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976612 4933 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0fb308-1858-4dea-bf49-38e577824bd0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976626 4933 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976642 4933 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976654 4933 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd0fb308-1858-4dea-bf49-38e577824bd0-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976667 4933 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-crcnz\" (UniqueName: \"kubernetes.io/projected/bd0fb308-1858-4dea-bf49-38e577824bd0-kube-api-access-crcnz\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:14 crc kubenswrapper[4933]: I1202 17:32:14.976678 4933 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd0fb308-1858-4dea-bf49-38e577824bd0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:15 crc kubenswrapper[4933]: I1202 17:32:15.020064 4933 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 02 17:32:15 crc kubenswrapper[4933]: I1202 17:32:15.079310 4933 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:15 crc kubenswrapper[4933]: I1202 17:32:15.336295 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd0fb308-1858-4dea-bf49-38e577824bd0","Type":"ContainerDied","Data":"68b06c06990fa20c06300cc85138725137d1ad0c2386088be8f9caab89076c55"} Dec 02 17:32:15 crc kubenswrapper[4933]: I1202 17:32:15.336344 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b06c06990fa20c06300cc85138725137d1ad0c2386088be8f9caab89076c55" Dec 02 17:32:15 crc kubenswrapper[4933]: I1202 17:32:15.336427 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 17:32:18 crc kubenswrapper[4933]: I1202 17:32:18.053333 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:32:18 crc kubenswrapper[4933]: E1202 17:32:18.054096 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:32:21 crc kubenswrapper[4933]: I1202 17:32:21.989589 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7lvvj"] Dec 02 17:32:21 crc kubenswrapper[4933]: E1202 17:32:21.990657 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c51796-cd06-49c3-b749-d20f6d22b1b1" containerName="collect-profiles" Dec 02 17:32:21 crc kubenswrapper[4933]: I1202 17:32:21.990675 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c51796-cd06-49c3-b749-d20f6d22b1b1" containerName="collect-profiles" Dec 02 17:32:21 crc kubenswrapper[4933]: E1202 17:32:21.990733 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0fb308-1858-4dea-bf49-38e577824bd0" containerName="tempest-tests-tempest-tests-runner" Dec 02 17:32:21 crc kubenswrapper[4933]: I1202 17:32:21.990744 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0fb308-1858-4dea-bf49-38e577824bd0" containerName="tempest-tests-tempest-tests-runner" Dec 02 17:32:21 crc kubenswrapper[4933]: I1202 17:32:21.991010 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0fb308-1858-4dea-bf49-38e577824bd0" containerName="tempest-tests-tempest-tests-runner" Dec 02 17:32:21 crc kubenswrapper[4933]: I1202 17:32:21.991040 
4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c51796-cd06-49c3-b749-d20f6d22b1b1" containerName="collect-profiles" Dec 02 17:32:21 crc kubenswrapper[4933]: I1202 17:32:21.992905 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.002506 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7lvvj"] Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.050766 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-utilities\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.050945 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-catalog-content\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.051127 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9979\" (UniqueName: \"kubernetes.io/projected/818897f6-33ea-41ae-b307-a8c61a5eeb1d-kube-api-access-p9979\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.153072 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9979\" (UniqueName: \"kubernetes.io/projected/818897f6-33ea-41ae-b307-a8c61a5eeb1d-kube-api-access-p9979\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.153414 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-utilities\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.153479 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-catalog-content\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.154803 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-utilities\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.154868 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-catalog-content\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.176082 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9979\" (UniqueName: \"kubernetes.io/projected/818897f6-33ea-41ae-b307-a8c61a5eeb1d-kube-api-access-p9979\") pod \"redhat-marketplace-7lvvj\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.325021 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:22 crc kubenswrapper[4933]: I1202 17:32:22.858426 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7lvvj"] Dec 02 17:32:23 crc kubenswrapper[4933]: I1202 17:32:23.443816 4933 generic.go:334] "Generic (PLEG): container finished" podID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerID="058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0" exitCode=0 Dec 02 17:32:23 crc kubenswrapper[4933]: I1202 17:32:23.443908 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7lvvj" event={"ID":"818897f6-33ea-41ae-b307-a8c61a5eeb1d","Type":"ContainerDied","Data":"058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0"} Dec 02 17:32:23 crc kubenswrapper[4933]: I1202 17:32:23.444187 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7lvvj" event={"ID":"818897f6-33ea-41ae-b307-a8c61a5eeb1d","Type":"ContainerStarted","Data":"d28caafd3ae2fc44cee260f3d8f81859e25d3e1dda83c78b63f5fefc7bd0ce12"} Dec 02 17:32:23 crc kubenswrapper[4933]: I1202 17:32:23.445931 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:32:25 crc kubenswrapper[4933]: I1202 17:32:25.470896 4933 generic.go:334] "Generic (PLEG): container finished" podID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerID="a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2" exitCode=0 Dec 02 17:32:25 crc kubenswrapper[4933]: I1202 17:32:25.470951 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7lvvj" event={"ID":"818897f6-33ea-41ae-b307-a8c61a5eeb1d","Type":"ContainerDied","Data":"a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2"} Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.050660 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.054672 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.070110 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jwtt2" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.092861 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.159947 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvn9x\" (UniqueName: \"kubernetes.io/projected/595befc5-8c2b-4b8e-8991-e12460cd0ef2-kube-api-access-cvn9x\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.160448 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.262594 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.262674 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvn9x\" (UniqueName: \"kubernetes.io/projected/595befc5-8c2b-4b8e-8991-e12460cd0ef2-kube-api-access-cvn9x\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.263393 4933 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.287831 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvn9x\" (UniqueName: \"kubernetes.io/projected/595befc5-8c2b-4b8e-8991-e12460cd0ef2-kube-api-access-cvn9x\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.311704 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"595befc5-8c2b-4b8e-8991-e12460cd0ef2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc 
kubenswrapper[4933]: I1202 17:32:26.407262 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.489599 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7lvvj" event={"ID":"818897f6-33ea-41ae-b307-a8c61a5eeb1d","Type":"ContainerStarted","Data":"e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68"} Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.914575 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7lvvj" podStartSLOduration=3.4240591289999998 podStartE2EDuration="5.914544174s" podCreationTimestamp="2025-12-02 17:32:21 +0000 UTC" firstStartedPulling="2025-12-02 17:32:23.445676543 +0000 UTC m=+6006.696903246" lastFinishedPulling="2025-12-02 17:32:25.936161588 +0000 UTC m=+6009.187388291" observedRunningTime="2025-12-02 17:32:26.507165462 +0000 UTC m=+6009.758392185" watchObservedRunningTime="2025-12-02 17:32:26.914544174 +0000 UTC m=+6010.165770877" Dec 02 17:32:26 crc kubenswrapper[4933]: I1202 17:32:26.923220 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 17:32:26 crc kubenswrapper[4933]: W1202 17:32:26.923783 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod595befc5_8c2b_4b8e_8991_e12460cd0ef2.slice/crio-b7b38e0e996071350902010c60e7495452425dc2296572e6fb3d9925286074e1 WatchSource:0}: Error finding container b7b38e0e996071350902010c60e7495452425dc2296572e6fb3d9925286074e1: Status 404 returned error can't find the container with id b7b38e0e996071350902010c60e7495452425dc2296572e6fb3d9925286074e1 Dec 02 17:32:27 crc kubenswrapper[4933]: I1202 17:32:27.504464 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"595befc5-8c2b-4b8e-8991-e12460cd0ef2","Type":"ContainerStarted","Data":"b7b38e0e996071350902010c60e7495452425dc2296572e6fb3d9925286074e1"} Dec 02 17:32:28 crc kubenswrapper[4933]: I1202 17:32:28.519695 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"595befc5-8c2b-4b8e-8991-e12460cd0ef2","Type":"ContainerStarted","Data":"03d24dfd6fd26bc95e61b309b23c8c3444e7debadff695f68eeb74cd127638df"} Dec 02 17:32:28 crc kubenswrapper[4933]: I1202 17:32:28.532265 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.538919349 podStartE2EDuration="3.532244924s" podCreationTimestamp="2025-12-02 17:32:25 +0000 UTC" firstStartedPulling="2025-12-02 17:32:26.928921639 +0000 UTC m=+6010.180148352" lastFinishedPulling="2025-12-02 17:32:27.922247224 +0000 UTC m=+6011.173473927" observedRunningTime="2025-12-02 17:32:28.532171242 +0000 UTC m=+6011.783397955" watchObservedRunningTime="2025-12-02 17:32:28.532244924 +0000 UTC m=+6011.783471647" Dec 02 17:32:29 crc kubenswrapper[4933]: I1202 17:32:29.055133 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:32:29 crc kubenswrapper[4933]: E1202 17:32:29.055799 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:32:32 crc kubenswrapper[4933]: I1202 17:32:32.325890 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:32 crc kubenswrapper[4933]: I1202 17:32:32.326355 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:32 crc kubenswrapper[4933]: I1202 17:32:32.396536 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:32 crc kubenswrapper[4933]: I1202 17:32:32.626331 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:32 crc kubenswrapper[4933]: I1202 17:32:32.677000 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7lvvj"] Dec 02 17:32:34 crc kubenswrapper[4933]: I1202 17:32:34.589754 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7lvvj" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="registry-server" containerID="cri-o://e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68" gracePeriod=2 Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.234862 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.386647 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9979\" (UniqueName: \"kubernetes.io/projected/818897f6-33ea-41ae-b307-a8c61a5eeb1d-kube-api-access-p9979\") pod \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.387145 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-utilities\") pod \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.387173 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-catalog-content\") pod \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\" (UID: \"818897f6-33ea-41ae-b307-a8c61a5eeb1d\") " Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.388775 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-utilities" (OuterVolumeSpecName: "utilities") pod "818897f6-33ea-41ae-b307-a8c61a5eeb1d" (UID: "818897f6-33ea-41ae-b307-a8c61a5eeb1d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.393902 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818897f6-33ea-41ae-b307-a8c61a5eeb1d-kube-api-access-p9979" (OuterVolumeSpecName: "kube-api-access-p9979") pod "818897f6-33ea-41ae-b307-a8c61a5eeb1d" (UID: "818897f6-33ea-41ae-b307-a8c61a5eeb1d"). InnerVolumeSpecName "kube-api-access-p9979". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.412576 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "818897f6-33ea-41ae-b307-a8c61a5eeb1d" (UID: "818897f6-33ea-41ae-b307-a8c61a5eeb1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.489458 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9979\" (UniqueName: \"kubernetes.io/projected/818897f6-33ea-41ae-b307-a8c61a5eeb1d-kube-api-access-p9979\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.489490 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.489502 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818897f6-33ea-41ae-b307-a8c61a5eeb1d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.602735 4933 generic.go:334] "Generic (PLEG): container finished" podID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerID="e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68" exitCode=0 Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.602783 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7lvvj" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.602784 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7lvvj" event={"ID":"818897f6-33ea-41ae-b307-a8c61a5eeb1d","Type":"ContainerDied","Data":"e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68"} Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.602866 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7lvvj" event={"ID":"818897f6-33ea-41ae-b307-a8c61a5eeb1d","Type":"ContainerDied","Data":"d28caafd3ae2fc44cee260f3d8f81859e25d3e1dda83c78b63f5fefc7bd0ce12"} Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.602886 4933 scope.go:117] "RemoveContainer" containerID="e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.633863 4933 scope.go:117] "RemoveContainer" containerID="a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.646848 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7lvvj"] Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.663963 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7lvvj"] Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.669369 4933 scope.go:117] "RemoveContainer" containerID="058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.714351 4933 scope.go:117] "RemoveContainer" containerID="e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68" Dec 02 17:32:35 crc kubenswrapper[4933]: E1202 17:32:35.714674 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68\": container with ID starting with e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68 not found: ID does not exist" containerID="e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.714712 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68"} err="failed to get container status \"e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68\": rpc error: code = NotFound desc = could not find container \"e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68\": container with ID starting with e4fc99671f4e7dd50b1ececf588b4328d7d11efcd55514b120cf149574777a68 not found: ID does not exist" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.714742 4933 scope.go:117] "RemoveContainer" containerID="a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2" Dec 02 17:32:35 crc kubenswrapper[4933]: E1202 17:32:35.715009 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2\": container with ID starting with a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2 not found: ID does not exist" containerID="a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.715030 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2"} err="failed to get container status \"a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2\": rpc error: code = NotFound desc = could not find container \"a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2\": container with ID starting with a40de143fbded94c7f4b707d591b57e3b9238a28e4c0f9e283744ec5c39648e2 not found: ID does not exist" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.715044 4933 scope.go:117] "RemoveContainer" containerID="058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0" Dec 02 17:32:35 crc kubenswrapper[4933]: E1202 17:32:35.715552 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0\": container with ID starting with 058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0 not found: ID does not exist" containerID="058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0" Dec 02 17:32:35 crc kubenswrapper[4933]: I1202 17:32:35.715597 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0"} err="failed to get container status \"058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0\": rpc error: code = NotFound desc = could not find container \"058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0\": container with ID starting with 058ec4c688e31d8314602b13488eba24e7cccdfcca02692c5c0d49d2e7a520a0 not found: ID does not exist" Dec 02 17:32:37 crc kubenswrapper[4933]: I1202 17:32:37.068786 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" path="/var/lib/kubelet/pods/818897f6-33ea-41ae-b307-a8c61a5eeb1d/volumes" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.213794 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6ssms"] Dec 02 17:32:38 crc kubenswrapper[4933]: E1202 17:32:38.214613 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="extract-utilities" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.214629 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="extract-utilities" Dec 02 17:32:38 crc kubenswrapper[4933]: E1202 17:32:38.214662 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="extract-content" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.214670 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="extract-content" Dec 02 17:32:38 crc kubenswrapper[4933]: E1202 17:32:38.214715 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="registry-server" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.214723 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" containerName="registry-server" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.215007 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="818897f6-33ea-41ae-b307-a8c61a5eeb1d" 
containerName="registry-server" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.216727 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.240962 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ssms"] Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.360255 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-catalog-content\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.360373 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-utilities\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.360551 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mctw\" (UniqueName: \"kubernetes.io/projected/15df34f0-0056-4137-8005-74be5a3b241c-kube-api-access-4mctw\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.462309 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-catalog-content\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.462511 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-utilities\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.462751 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mctw\" (UniqueName: \"kubernetes.io/projected/15df34f0-0056-4137-8005-74be5a3b241c-kube-api-access-4mctw\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.463159 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-utilities\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.463153 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-catalog-content\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " 
pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.492070 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mctw\" (UniqueName: \"kubernetes.io/projected/15df34f0-0056-4137-8005-74be5a3b241c-kube-api-access-4mctw\") pod \"community-operators-6ssms\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:38 crc kubenswrapper[4933]: I1202 17:32:38.537560 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:39 crc kubenswrapper[4933]: I1202 17:32:39.135424 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ssms"] Dec 02 17:32:39 crc kubenswrapper[4933]: I1202 17:32:39.723861 4933 generic.go:334] "Generic (PLEG): container finished" podID="15df34f0-0056-4137-8005-74be5a3b241c" containerID="929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467" exitCode=0 Dec 02 17:32:39 crc kubenswrapper[4933]: I1202 17:32:39.723973 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerDied","Data":"929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467"} Dec 02 17:32:39 crc kubenswrapper[4933]: I1202 17:32:39.724184 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerStarted","Data":"7b5aee9a08b66e6a1e9816368d787407905ab2deaf11d6917237b6efcbe54c9b"} Dec 02 17:32:40 crc kubenswrapper[4933]: I1202 17:32:40.736056 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerStarted","Data":"68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9"} Dec 02 17:32:41 crc kubenswrapper[4933]: I1202 17:32:41.753157 4933 generic.go:334] "Generic (PLEG): container finished" podID="15df34f0-0056-4137-8005-74be5a3b241c" containerID="68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9" exitCode=0 Dec 02 17:32:41 crc kubenswrapper[4933]: I1202 17:32:41.753285 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerDied","Data":"68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9"} Dec 02 17:32:43 crc kubenswrapper[4933]: I1202 17:32:43.787799 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerStarted","Data":"d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82"} Dec 02 17:32:43 crc kubenswrapper[4933]: I1202 17:32:43.812554 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6ssms" podStartSLOduration=3.050390901 podStartE2EDuration="5.812533883s" podCreationTimestamp="2025-12-02 17:32:38 +0000 UTC" firstStartedPulling="2025-12-02 17:32:39.726552706 +0000 UTC m=+6022.977779409" lastFinishedPulling="2025-12-02 17:32:42.488695688 +0000 UTC m=+6025.739922391" observedRunningTime="2025-12-02 17:32:43.809779941 +0000 UTC m=+6027.061006744" watchObservedRunningTime="2025-12-02 17:32:43.812533883 
+0000 UTC m=+6027.063760586" Dec 02 17:32:44 crc kubenswrapper[4933]: I1202 17:32:44.053215 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:32:44 crc kubenswrapper[4933]: E1202 17:32:44.053598 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:32:48 crc kubenswrapper[4933]: I1202 17:32:48.538048 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:48 crc kubenswrapper[4933]: I1202 17:32:48.539096 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:48 crc kubenswrapper[4933]: I1202 17:32:48.592562 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:48 crc kubenswrapper[4933]: I1202 17:32:48.910025 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:48 crc kubenswrapper[4933]: I1202 17:32:48.970701 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6ssms"] Dec 02 17:32:50 crc kubenswrapper[4933]: I1202 17:32:50.879989 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6ssms" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="registry-server" containerID="cri-o://d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82" gracePeriod=2 Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.399028 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.539909 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-catalog-content\") pod \"15df34f0-0056-4137-8005-74be5a3b241c\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.540090 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-utilities\") pod \"15df34f0-0056-4137-8005-74be5a3b241c\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.540208 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mctw\" (UniqueName: \"kubernetes.io/projected/15df34f0-0056-4137-8005-74be5a3b241c-kube-api-access-4mctw\") pod \"15df34f0-0056-4137-8005-74be5a3b241c\" (UID: \"15df34f0-0056-4137-8005-74be5a3b241c\") " Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.540773 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-utilities" (OuterVolumeSpecName: "utilities") pod "15df34f0-0056-4137-8005-74be5a3b241c" (UID: "15df34f0-0056-4137-8005-74be5a3b241c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.541752 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.546550 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15df34f0-0056-4137-8005-74be5a3b241c-kube-api-access-4mctw" (OuterVolumeSpecName: "kube-api-access-4mctw") pod "15df34f0-0056-4137-8005-74be5a3b241c" (UID: "15df34f0-0056-4137-8005-74be5a3b241c"). InnerVolumeSpecName "kube-api-access-4mctw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.588874 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15df34f0-0056-4137-8005-74be5a3b241c" (UID: "15df34f0-0056-4137-8005-74be5a3b241c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.644640 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15df34f0-0056-4137-8005-74be5a3b241c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.644670 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mctw\" (UniqueName: \"kubernetes.io/projected/15df34f0-0056-4137-8005-74be5a3b241c-kube-api-access-4mctw\") on node \"crc\" DevicePath \"\"" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.898217 4933 generic.go:334] "Generic (PLEG): container finished" podID="15df34f0-0056-4137-8005-74be5a3b241c" containerID="d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82" exitCode=0 Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.898310 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6ssms" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.898340 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerDied","Data":"d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82"} Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.898816 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ssms" event={"ID":"15df34f0-0056-4137-8005-74be5a3b241c","Type":"ContainerDied","Data":"7b5aee9a08b66e6a1e9816368d787407905ab2deaf11d6917237b6efcbe54c9b"} Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.898878 4933 scope.go:117] "RemoveContainer" containerID="d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.951122 4933 scope.go:117] "RemoveContainer" containerID="68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9" Dec 02 17:32:51 crc kubenswrapper[4933]: I1202 17:32:51.952073 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6ssms"] Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.027193 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6ssms"] Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.072719 4933 scope.go:117] "RemoveContainer" containerID="929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467" Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.106312 4933 scope.go:117] "RemoveContainer" containerID="d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82" Dec 02 17:32:52 crc kubenswrapper[4933]: E1202 17:32:52.106494 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82\": container with ID starting with d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82 not found: ID does not exist" containerID="d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82" Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.106529 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82"} err="failed to get container status 
\"d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82\": rpc error: code = NotFound desc = could not find container \"d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82\": container with ID starting with d47c830f4ed967adfa02f2392b0bcd7567a9373c27d5510c02c3167bc8779d82 not found: ID does not exist" Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.106554 4933 scope.go:117] "RemoveContainer" containerID="68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9" Dec 02 17:32:52 crc kubenswrapper[4933]: E1202 17:32:52.106751 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9\": container with ID starting with 68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9 not found: ID does not exist" containerID="68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9" Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.106781 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9"} err="failed to get container status \"68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9\": rpc error: code = NotFound desc = could not find container \"68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9\": container with ID starting with 68e0eb222355277db1049d119870cde2900b0789af721b3d9a0afba3dea97cc9 not found: ID does not exist" Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.106802 4933 scope.go:117] "RemoveContainer" containerID="929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467" Dec 02 17:32:52 crc kubenswrapper[4933]: E1202 17:32:52.107344 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467\": container with ID starting with 929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467 not found: ID does not exist" containerID="929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467" Dec 02 17:32:52 crc kubenswrapper[4933]: I1202 17:32:52.107372 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467"} err="failed to get container status \"929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467\": rpc error: code = NotFound desc = could not find container \"929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467\": container with ID starting with 929ff413c2659933aea9db3aaf6f476511c5fd6290ff04532c44e8b01bd1b467 not found: ID does not exist" Dec 02 17:32:53 crc kubenswrapper[4933]: I1202 17:32:53.069547 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15df34f0-0056-4137-8005-74be5a3b241c" path="/var/lib/kubelet/pods/15df34f0-0056-4137-8005-74be5a3b241c/volumes" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.662899 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dpxp8/must-gather-8jmnr"] Dec 02 17:32:57 crc kubenswrapper[4933]: E1202 17:32:57.663969 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="registry-server" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.663986 4933 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="registry-server" Dec 02 17:32:57 crc kubenswrapper[4933]: E1202 17:32:57.664022 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="extract-utilities" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.664031 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="extract-utilities" Dec 02 17:32:57 crc kubenswrapper[4933]: E1202 17:32:57.664063 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="extract-content" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.664070 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="extract-content" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.664360 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df34f0-0056-4137-8005-74be5a3b241c" containerName="registry-server" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.665916 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.668078 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dpxp8"/"openshift-service-ca.crt" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.669709 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dpxp8"/"kube-root-ca.crt" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.672524 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dpxp8"/"default-dockercfg-rm6nf" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.678350 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dpxp8/must-gather-8jmnr"] Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.816185 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/924c1228-98a1-4dea-b42f-d1f680719607-must-gather-output\") pod \"must-gather-8jmnr\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.816365 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ppb\" (UniqueName: \"kubernetes.io/projected/924c1228-98a1-4dea-b42f-d1f680719607-kube-api-access-94ppb\") pod \"must-gather-8jmnr\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.918570 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ppb\" (UniqueName: \"kubernetes.io/projected/924c1228-98a1-4dea-b42f-d1f680719607-kube-api-access-94ppb\") pod \"must-gather-8jmnr\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.918772 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/924c1228-98a1-4dea-b42f-d1f680719607-must-gather-output\") pod \"must-gather-8jmnr\" (UID: 
\"924c1228-98a1-4dea-b42f-d1f680719607\") " pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.919447 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/924c1228-98a1-4dea-b42f-d1f680719607-must-gather-output\") pod \"must-gather-8jmnr\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.944189 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ppb\" (UniqueName: \"kubernetes.io/projected/924c1228-98a1-4dea-b42f-d1f680719607-kube-api-access-94ppb\") pod \"must-gather-8jmnr\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:57 crc kubenswrapper[4933]: I1202 17:32:57.985797 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:32:58 crc kubenswrapper[4933]: I1202 17:32:58.054014 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:32:58 crc kubenswrapper[4933]: E1202 17:32:58.054757 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:32:58 crc kubenswrapper[4933]: I1202 17:32:58.547731 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dpxp8/must-gather-8jmnr"] Dec 02 17:32:58 crc kubenswrapper[4933]: I1202 17:32:58.976555 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" event={"ID":"924c1228-98a1-4dea-b42f-d1f680719607","Type":"ContainerStarted","Data":"9759db6e8db6fee1ad24925db6aa9e757e06cc8769659c0fa8c4fbebcd771455"} Dec 02 17:33:03 crc kubenswrapper[4933]: I1202 17:33:03.069691 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" event={"ID":"924c1228-98a1-4dea-b42f-d1f680719607","Type":"ContainerStarted","Data":"6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d"} Dec 02 17:33:04 crc kubenswrapper[4933]: I1202 17:33:04.076559 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" event={"ID":"924c1228-98a1-4dea-b42f-d1f680719607","Type":"ContainerStarted","Data":"ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66"} Dec 02 17:33:04 crc kubenswrapper[4933]: I1202 17:33:04.106891 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" podStartSLOduration=3.260868011 podStartE2EDuration="7.106874386s" podCreationTimestamp="2025-12-02 17:32:57 +0000 UTC" firstStartedPulling="2025-12-02 17:32:58.5709999 +0000 UTC m=+6041.822226603" lastFinishedPulling="2025-12-02 17:33:02.417006275 +0000 UTC m=+6045.668232978" observedRunningTime="2025-12-02 17:33:04.091107795 +0000 UTC m=+6047.342334508" watchObservedRunningTime="2025-12-02 17:33:04.106874386 +0000 UTC m=+6047.358101089" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 
17:33:08.202922 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-9zgpr"] Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.205735 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.288857 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-host\") pod \"crc-debug-9zgpr\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.288955 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67s9\" (UniqueName: \"kubernetes.io/projected/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-kube-api-access-d67s9\") pod \"crc-debug-9zgpr\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.391179 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-host\") pod \"crc-debug-9zgpr\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.391262 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d67s9\" (UniqueName: \"kubernetes.io/projected/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-kube-api-access-d67s9\") pod \"crc-debug-9zgpr\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.391618 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-host\") pod \"crc-debug-9zgpr\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.433446 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67s9\" (UniqueName: \"kubernetes.io/projected/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-kube-api-access-d67s9\") pod \"crc-debug-9zgpr\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:08 crc kubenswrapper[4933]: I1202 17:33:08.529710 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:33:09 crc kubenswrapper[4933]: I1202 17:33:09.055134 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:33:09 crc kubenswrapper[4933]: E1202 17:33:09.055905 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:33:09 crc kubenswrapper[4933]: I1202 17:33:09.144641 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" event={"ID":"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0","Type":"ContainerStarted","Data":"96b36a94eb57ebf05508a2468db0c6c8048f64423734650b7bc08f8b8212db81"} Dec 02 17:33:22 crc kubenswrapper[4933]: I1202 17:33:22.302618 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" event={"ID":"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0","Type":"ContainerStarted","Data":"ab86ce2068441135695c77541201eb3c535b9355f93dd544561915e3e75a3495"} Dec 02 17:33:22 crc kubenswrapper[4933]: I1202 17:33:22.326738 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" podStartSLOduration=1.592750658 podStartE2EDuration="14.326715138s" podCreationTimestamp="2025-12-02 17:33:08 +0000 UTC" firstStartedPulling="2025-12-02 17:33:08.584479923 +0000 UTC m=+6051.835706616" lastFinishedPulling="2025-12-02 17:33:21.318444393 +0000 UTC m=+6064.569671096" observedRunningTime="2025-12-02 17:33:22.31718118 +0000 UTC m=+6065.568407883" watchObservedRunningTime="2025-12-02 17:33:22.326715138 +0000 UTC m=+6065.577941861" Dec 02 17:33:23 crc kubenswrapper[4933]: I1202 17:33:23.054167 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:33:23 crc kubenswrapper[4933]: E1202 17:33:23.054718 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:33:35 crc kubenswrapper[4933]: I1202 17:33:35.054450 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:33:35 crc kubenswrapper[4933]: E1202 17:33:35.055336 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:33:49 crc kubenswrapper[4933]: I1202 17:33:49.055073 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 
17:33:49 crc kubenswrapper[4933]: I1202 17:33:49.610051 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"a83509fdc2a402118ff91daac7799c2a25cc85aee38ea32fab3a07cc2be654e1"} Dec 02 17:34:10 crc kubenswrapper[4933]: I1202 17:34:10.847974 4933 generic.go:334] "Generic (PLEG): container finished" podID="c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" containerID="ab86ce2068441135695c77541201eb3c535b9355f93dd544561915e3e75a3495" exitCode=0 Dec 02 17:34:10 crc kubenswrapper[4933]: I1202 17:34:10.848487 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" event={"ID":"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0","Type":"ContainerDied","Data":"ab86ce2068441135695c77541201eb3c535b9355f93dd544561915e3e75a3495"} Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.008551 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.057548 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-9zgpr"] Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.070214 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d67s9\" (UniqueName: \"kubernetes.io/projected/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-kube-api-access-d67s9\") pod \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.070323 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-host\") pod \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\" (UID: \"c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0\") " Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.070398 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-host" (OuterVolumeSpecName: "host") pod "c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" (UID: "c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.071387 4933 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-host\") on node \"crc\" DevicePath \"\"" Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.071800 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-9zgpr"] Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.088996 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-kube-api-access-d67s9" (OuterVolumeSpecName: "kube-api-access-d67s9") pod "c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" (UID: "c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0"). InnerVolumeSpecName "kube-api-access-d67s9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.173494 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d67s9\" (UniqueName: \"kubernetes.io/projected/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0-kube-api-access-d67s9\") on node \"crc\" DevicePath \"\"" Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.871544 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96b36a94eb57ebf05508a2468db0c6c8048f64423734650b7bc08f8b8212db81" Dec 02 17:34:12 crc kubenswrapper[4933]: I1202 17:34:12.871592 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-9zgpr" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.068209 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" path="/var/lib/kubelet/pods/c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0/volumes" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.246333 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-dzmpw"] Dec 02 17:34:13 crc kubenswrapper[4933]: E1202 17:34:13.246966 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" containerName="container-00" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.246987 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" containerName="container-00" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.247262 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="c390e16c-d7b0-4e6c-97c7-4f8e5bf6eac0" containerName="container-00" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.248233 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.297692 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cc4q\" (UniqueName: \"kubernetes.io/projected/476faa67-36e9-4295-8fe1-41157a48d2a3-kube-api-access-2cc4q\") pod \"crc-debug-dzmpw\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.298046 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476faa67-36e9-4295-8fe1-41157a48d2a3-host\") pod \"crc-debug-dzmpw\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.401314 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476faa67-36e9-4295-8fe1-41157a48d2a3-host\") pod \"crc-debug-dzmpw\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.401498 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476faa67-36e9-4295-8fe1-41157a48d2a3-host\") pod \"crc-debug-dzmpw\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.401767 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cc4q\" (UniqueName: \"kubernetes.io/projected/476faa67-36e9-4295-8fe1-41157a48d2a3-kube-api-access-2cc4q\") pod \"crc-debug-dzmpw\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.423688 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cc4q\" (UniqueName: \"kubernetes.io/projected/476faa67-36e9-4295-8fe1-41157a48d2a3-kube-api-access-2cc4q\") pod \"crc-debug-dzmpw\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.569673 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:13 crc kubenswrapper[4933]: I1202 17:34:13.884238 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" event={"ID":"476faa67-36e9-4295-8fe1-41157a48d2a3","Type":"ContainerStarted","Data":"00bf94081d249bfffd9925d55ed43c5a4c4d8e0319e16c3b42e62348d11e4a42"} Dec 02 17:34:14 crc kubenswrapper[4933]: I1202 17:34:14.898370 4933 generic.go:334] "Generic (PLEG): container finished" podID="476faa67-36e9-4295-8fe1-41157a48d2a3" containerID="8fb666ac0cecec44fb0c4e0625db4677c20281790414e6fc8fb98260085a273e" exitCode=0 Dec 02 17:34:14 crc kubenswrapper[4933]: I1202 17:34:14.898419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" event={"ID":"476faa67-36e9-4295-8fe1-41157a48d2a3","Type":"ContainerDied","Data":"8fb666ac0cecec44fb0c4e0625db4677c20281790414e6fc8fb98260085a273e"} Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.055086 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.181094 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cc4q\" (UniqueName: \"kubernetes.io/projected/476faa67-36e9-4295-8fe1-41157a48d2a3-kube-api-access-2cc4q\") pod \"476faa67-36e9-4295-8fe1-41157a48d2a3\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.181151 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476faa67-36e9-4295-8fe1-41157a48d2a3-host\") pod \"476faa67-36e9-4295-8fe1-41157a48d2a3\" (UID: \"476faa67-36e9-4295-8fe1-41157a48d2a3\") " Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.181249 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/476faa67-36e9-4295-8fe1-41157a48d2a3-host" (OuterVolumeSpecName: "host") pod "476faa67-36e9-4295-8fe1-41157a48d2a3" (UID: "476faa67-36e9-4295-8fe1-41157a48d2a3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.183427 4933 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476faa67-36e9-4295-8fe1-41157a48d2a3-host\") on node \"crc\" DevicePath \"\"" Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.187840 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476faa67-36e9-4295-8fe1-41157a48d2a3-kube-api-access-2cc4q" (OuterVolumeSpecName: "kube-api-access-2cc4q") pod "476faa67-36e9-4295-8fe1-41157a48d2a3" (UID: "476faa67-36e9-4295-8fe1-41157a48d2a3"). InnerVolumeSpecName "kube-api-access-2cc4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.293672 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cc4q\" (UniqueName: \"kubernetes.io/projected/476faa67-36e9-4295-8fe1-41157a48d2a3-kube-api-access-2cc4q\") on node \"crc\" DevicePath \"\"" Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.919711 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" event={"ID":"476faa67-36e9-4295-8fe1-41157a48d2a3","Type":"ContainerDied","Data":"00bf94081d249bfffd9925d55ed43c5a4c4d8e0319e16c3b42e62348d11e4a42"} Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.919763 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00bf94081d249bfffd9925d55ed43c5a4c4d8e0319e16c3b42e62348d11e4a42" Dec 02 17:34:16 crc kubenswrapper[4933]: I1202 17:34:16.919843 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-dzmpw" Dec 02 17:34:17 crc kubenswrapper[4933]: I1202 17:34:17.377121 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-dzmpw"] Dec 02 17:34:17 crc kubenswrapper[4933]: I1202 17:34:17.386615 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-dzmpw"] Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.548142 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-ppf66"] Dec 02 17:34:18 crc kubenswrapper[4933]: E1202 17:34:18.549047 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476faa67-36e9-4295-8fe1-41157a48d2a3" containerName="container-00" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.549065 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="476faa67-36e9-4295-8fe1-41157a48d2a3" containerName="container-00" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.549348 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="476faa67-36e9-4295-8fe1-41157a48d2a3" containerName="container-00" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.550300 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.686333 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpcnc\" (UniqueName: \"kubernetes.io/projected/62477a6c-7b87-4ed1-8645-5df52b452051-kube-api-access-cpcnc\") pod \"crc-debug-ppf66\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.686482 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62477a6c-7b87-4ed1-8645-5df52b452051-host\") pod \"crc-debug-ppf66\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.790282 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpcnc\" (UniqueName: \"kubernetes.io/projected/62477a6c-7b87-4ed1-8645-5df52b452051-kube-api-access-cpcnc\") pod \"crc-debug-ppf66\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.790524 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62477a6c-7b87-4ed1-8645-5df52b452051-host\") pod \"crc-debug-ppf66\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.792575 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62477a6c-7b87-4ed1-8645-5df52b452051-host\") pod \"crc-debug-ppf66\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.837613 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpcnc\" (UniqueName: \"kubernetes.io/projected/62477a6c-7b87-4ed1-8645-5df52b452051-kube-api-access-cpcnc\") pod \"crc-debug-ppf66\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.868701 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:18 crc kubenswrapper[4933]: I1202 17:34:18.951217 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-ppf66" event={"ID":"62477a6c-7b87-4ed1-8645-5df52b452051","Type":"ContainerStarted","Data":"daaa28e85fa72be46fea47b0e0944cebfe055ab90ad7285f7827fc2a6a0fcc53"} Dec 02 17:34:19 crc kubenswrapper[4933]: I1202 17:34:19.067367 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476faa67-36e9-4295-8fe1-41157a48d2a3" path="/var/lib/kubelet/pods/476faa67-36e9-4295-8fe1-41157a48d2a3/volumes" Dec 02 17:34:19 crc kubenswrapper[4933]: I1202 17:34:19.965661 4933 generic.go:334] "Generic (PLEG): container finished" podID="62477a6c-7b87-4ed1-8645-5df52b452051" containerID="581f6a64f905352a0b01dfdec6beaa61816926e3cd6196e33cb8992f06f4c04a" exitCode=0 Dec 02 17:34:19 crc kubenswrapper[4933]: I1202 17:34:19.965762 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/crc-debug-ppf66" event={"ID":"62477a6c-7b87-4ed1-8645-5df52b452051","Type":"ContainerDied","Data":"581f6a64f905352a0b01dfdec6beaa61816926e3cd6196e33cb8992f06f4c04a"} Dec 02 17:34:20 crc kubenswrapper[4933]: I1202 17:34:20.020040 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-ppf66"] Dec 02 17:34:20 crc kubenswrapper[4933]: I1202 17:34:20.034535 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dpxp8/crc-debug-ppf66"] Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.104532 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.145397 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62477a6c-7b87-4ed1-8645-5df52b452051-host\") pod \"62477a6c-7b87-4ed1-8645-5df52b452051\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.145696 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpcnc\" (UniqueName: \"kubernetes.io/projected/62477a6c-7b87-4ed1-8645-5df52b452051-kube-api-access-cpcnc\") pod \"62477a6c-7b87-4ed1-8645-5df52b452051\" (UID: \"62477a6c-7b87-4ed1-8645-5df52b452051\") " Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.145650 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62477a6c-7b87-4ed1-8645-5df52b452051-host" (OuterVolumeSpecName: "host") pod "62477a6c-7b87-4ed1-8645-5df52b452051" (UID: "62477a6c-7b87-4ed1-8645-5df52b452051"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.147587 4933 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62477a6c-7b87-4ed1-8645-5df52b452051-host\") on node \"crc\" DevicePath \"\"" Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.154024 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62477a6c-7b87-4ed1-8645-5df52b452051-kube-api-access-cpcnc" (OuterVolumeSpecName: "kube-api-access-cpcnc") pod "62477a6c-7b87-4ed1-8645-5df52b452051" (UID: "62477a6c-7b87-4ed1-8645-5df52b452051"). InnerVolumeSpecName "kube-api-access-cpcnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.249848 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpcnc\" (UniqueName: \"kubernetes.io/projected/62477a6c-7b87-4ed1-8645-5df52b452051-kube-api-access-cpcnc\") on node \"crc\" DevicePath \"\"" Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.990134 4933 scope.go:117] "RemoveContainer" containerID="581f6a64f905352a0b01dfdec6beaa61816926e3cd6196e33cb8992f06f4c04a" Dec 02 17:34:21 crc kubenswrapper[4933]: I1202 17:34:21.990184 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/crc-debug-ppf66" Dec 02 17:34:23 crc kubenswrapper[4933]: I1202 17:34:23.068718 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62477a6c-7b87-4ed1-8645-5df52b452051" path="/var/lib/kubelet/pods/62477a6c-7b87-4ed1-8645-5df52b452051/volumes" Dec 02 17:34:44 crc kubenswrapper[4933]: I1202 17:34:44.868025 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-api/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.050737 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-evaluator/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.108347 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-notifier/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.117834 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-listener/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.306982 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56654f9db6-xk8pt_cec17902-d3a5-4961-88eb-65c3773747fa/barbican-api/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.315807 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56654f9db6-xk8pt_cec17902-d3a5-4961-88eb-65c3773747fa/barbican-api-log/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.434393 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64dd586798-jl4hw_78795142-23f4-4bfd-ba25-479d6cc3c19f/barbican-keystone-listener/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.624929 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56bd7dd77f-sxk4g_04729c7a-7c1b-4138-832a-f6d0bf327720/barbican-worker/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.665065 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64dd586798-jl4hw_78795142-23f4-4bfd-ba25-479d6cc3c19f/barbican-keystone-listener-log/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.686871 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56bd7dd77f-sxk4g_04729c7a-7c1b-4138-832a-f6d0bf327720/barbican-worker-log/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.913405 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp_2060aa16-0f55-457f-98c1-058372e78f0f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:45 crc kubenswrapper[4933]: I1202 17:34:45.989067 4933 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/ceilometer-central-agent/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.144982 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/ceilometer-notification-agent/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.145281 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/proxy-httpd/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.169232 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/sg-core/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.386707 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02/cinder-api-log/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.437246 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02/cinder-api/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.531530 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60f07df2-68b3-4c78-9279-5dd6d9c71397/cinder-scheduler/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.660398 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60f07df2-68b3-4c78-9279-5dd6d9c71397/probe/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.711789 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-87nbr_0a4e084f-f2c5-418d-8990-6074168317ab/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.901212 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz_9e5c1276-65ce-4553-9d05-e8e27aaef6b3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:46 crc kubenswrapper[4933]: I1202 17:34:46.950297 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-lb9d4_d8a58bed-855e-481a-a1c6-a2fa6851e55c/init/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.134595 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-lb9d4_d8a58bed-855e-481a-a1c6-a2fa6851e55c/init/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.225849 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg_2baaff97-25e3-44ca-818e-0c9d121abe01/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.230354 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-lb9d4_d8a58bed-855e-481a-a1c6-a2fa6851e55c/dnsmasq-dns/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.438924 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_848e9b92-1e28-4d82-b057-0335915a6155/glance-httpd/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.441949 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_848e9b92-1e28-4d82-b057-0335915a6155/glance-log/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.600510 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_45180c36-0fa3-4abc-a647-9b4beb0ed87d/glance-httpd/0.log" Dec 02 17:34:47 crc kubenswrapper[4933]: I1202 17:34:47.637865 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_45180c36-0fa3-4abc-a647-9b4beb0ed87d/glance-log/0.log" Dec 02 17:34:48 crc kubenswrapper[4933]: I1202 17:34:48.205746 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-c8865d5c5-ktz2c_1aa97de0-3a2a-474c-904e-5fa59773c33c/heat-engine/0.log" Dec 02 17:34:48 crc kubenswrapper[4933]: I1202 17:34:48.440504 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7cfb68f886-5mpb2_0f856765-5f4c-445f-bcd1-736db6fb2c56/heat-api/0.log" Dec 02 17:34:48 crc kubenswrapper[4933]: I1202 17:34:48.505564 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-87krj_6efaf1ff-2f24-4e58-8899-a5ca660bd6cc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:48 crc kubenswrapper[4933]: I1202 17:34:48.582905 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7d8849d76d-cc29j_6ce3e9ef-c14e-49c8-bd2b-8b268f028516/heat-cfnapi/0.log" Dec 02 17:34:48 crc kubenswrapper[4933]: I1202 17:34:48.677619 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mqv4w_213c04c8-e37f-4898-8141-cd8a5a5e6626/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:48 crc kubenswrapper[4933]: I1202 17:34:48.935018 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411581-d66dr_86c26f4d-35bf-4616-8935-e00dad2fb46a/keystone-cron/0.log" Dec 02 17:34:49 crc kubenswrapper[4933]: I1202 17:34:49.070117 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_06766e68-6811-4cd2-bb90-67cc353669e6/kube-state-metrics/0.log" Dec 02 17:34:49 crc kubenswrapper[4933]: I1202 17:34:49.229403 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r_7b465ce0-4efc-4624-b140-f3bbb0e0b420/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:49 crc kubenswrapper[4933]: I1202 17:34:49.239082 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5445f4c57f-62kcb_e87150f8-ace7-485f-bbfe-8818205e400b/keystone-api/0.log" Dec 02 17:34:49 crc kubenswrapper[4933]: I1202 17:34:49.433817 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-995pm_e28d7f01-efc3-4011-baea-61cc2a6f0cd9/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:49 crc kubenswrapper[4933]: I1202 17:34:49.662975 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_43672f01-bd87-4595-8ba0-d76811762bc2/mysqld-exporter/0.log" Dec 02 17:34:50 crc kubenswrapper[4933]: I1202 17:34:50.092177 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d5bbf46cc-92zkh_03882bf4-d9e9-460d-a94e-fb17189d0278/neutron-httpd/0.log" Dec 02 17:34:50 crc kubenswrapper[4933]: I1202 17:34:50.092476 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2_438c203a-5750-4377-95f9-3aa2353f6f53/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:50 crc kubenswrapper[4933]: I1202 17:34:50.263402 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d5bbf46cc-92zkh_03882bf4-d9e9-460d-a94e-fb17189d0278/neutron-api/0.log" Dec 02 17:34:50 crc kubenswrapper[4933]: I1202 17:34:50.720610 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f8103500-191e-4988-b97f-532c1f1c1b20/nova-cell0-conductor-conductor/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.036307 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9aa955fa-0221-4590-bd34-0ee84ba06562/nova-api-log/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.088738 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3186bcaa-4306-4842-885b-6afd00553e78/nova-cell1-conductor-conductor/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.443057 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cbhhv_5e83fdad-1cda-4282-b5e7-6911bdd8d9a0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.445107 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d7593905-f88c-466a-b547-9a3e59588987/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.521974 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9aa955fa-0221-4590-bd34-0ee84ba06562/nova-api-api/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.776773 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d/nova-metadata-log/0.log" Dec 02 17:34:51 crc kubenswrapper[4933]: I1202 17:34:51.973999 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a40f5b56-cf52-45da-b3b8-70a271c07984/nova-scheduler-scheduler/0.log" Dec 02 17:34:52 crc kubenswrapper[4933]: I1202 17:34:52.062797 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_553a306b-ceeb-41b2-8b2c-c32dcd70639e/mysql-bootstrap/0.log" Dec 02 17:34:52 crc kubenswrapper[4933]: I1202 17:34:52.297349 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_553a306b-ceeb-41b2-8b2c-c32dcd70639e/mysql-bootstrap/0.log" Dec 02 17:34:52 crc kubenswrapper[4933]: I1202 17:34:52.339290 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_553a306b-ceeb-41b2-8b2c-c32dcd70639e/galera/0.log" Dec 02 17:34:52 crc kubenswrapper[4933]: I1202 17:34:52.469458 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cd426cb0-522e-4c62-874e-a85119e82490/mysql-bootstrap/0.log" Dec 02 17:34:52 crc kubenswrapper[4933]: I1202 17:34:52.721002 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cd426cb0-522e-4c62-874e-a85119e82490/galera/0.log" Dec 02 17:34:52 crc kubenswrapper[4933]: I1202 17:34:52.768943 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cd426cb0-522e-4c62-874e-a85119e82490/mysql-bootstrap/0.log" Dec 02 17:34:52 crc 
kubenswrapper[4933]: I1202 17:34:52.990508 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f673f811-c1d1-4e11-94d8-4932e9761bbf/openstackclient/0.log" Dec 02 17:34:53 crc kubenswrapper[4933]: I1202 17:34:53.049706 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-g8qb5_8a35aeec-161d-4ef9-a42b-7967c06c7249/ovn-controller/0.log" Dec 02 17:34:53 crc kubenswrapper[4933]: I1202 17:34:53.333442 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cvdrx_b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2/openstack-network-exporter/0.log" Dec 02 17:34:53 crc kubenswrapper[4933]: I1202 17:34:53.469565 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovsdb-server-init/0.log" Dec 02 17:34:53 crc kubenswrapper[4933]: I1202 17:34:53.695159 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovsdb-server-init/0.log" Dec 02 17:34:53 crc kubenswrapper[4933]: I1202 17:34:53.754026 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovsdb-server/0.log" Dec 02 17:34:53 crc kubenswrapper[4933]: I1202 17:34:53.767658 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovs-vswitchd/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.026940 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-56tww_5bb48f13-b80b-462d-acf5-8751ec7aaa8e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.106972 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d/nova-metadata-metadata/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.155395 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f365adeb-4cfd-409d-8bd8-29b4779e1e0f/openstack-network-exporter/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.309438 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f365adeb-4cfd-409d-8bd8-29b4779e1e0f/ovn-northd/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.365365 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_662ba342-72a0-430b-b46d-d6f0f0eafd2b/ovsdbserver-nb/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.374022 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_662ba342-72a0-430b-b46d-d6f0f0eafd2b/openstack-network-exporter/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.617955 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5b04559-f2ad-49dc-a280-626d6de841de/ovsdbserver-sb/0.log" Dec 02 17:34:54 crc kubenswrapper[4933]: I1202 17:34:54.634793 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5b04559-f2ad-49dc-a280-626d6de841de/openstack-network-exporter/0.log" Dec 02 17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.318168 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64bbf7fc9b-xn4tj_250ea801-8fe7-44d3-b5ec-78cd22a3fa39/placement-log/0.log" Dec 02 
17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.460146 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64bbf7fc9b-xn4tj_250ea801-8fe7-44d3-b5ec-78cd22a3fa39/placement-api/0.log" Dec 02 17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.581536 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/init-config-reloader/0.log" Dec 02 17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.812176 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/prometheus/0.log" Dec 02 17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.824901 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/config-reloader/0.log" Dec 02 17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.868501 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/init-config-reloader/0.log" Dec 02 17:34:55 crc kubenswrapper[4933]: I1202 17:34:55.885929 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/thanos-sidecar/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.179667 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd27c6c8-91fa-4036-b50e-996c263b202c/setup-container/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.367376 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd27c6c8-91fa-4036-b50e-996c263b202c/setup-container/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.443044 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd27c6c8-91fa-4036-b50e-996c263b202c/rabbitmq/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.606403 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_528bcbae-e7ca-4ad1-8e7d-571da6cb972b/setup-container/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.729644 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_528bcbae-e7ca-4ad1-8e7d-571da6cb972b/setup-container/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.829933 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_528bcbae-e7ca-4ad1-8e7d-571da6cb972b/rabbitmq/0.log" Dec 02 17:34:56 crc kubenswrapper[4933]: I1202 17:34:56.878047 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb_b2b6f713-7e70-4d1e-9ee4-5fa433a01ded/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.068519 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-l7rfq_1eeef066-30d5-47f9-90a0-2815244b7ebb/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.133787 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mznps_b712434b-6106-44ed-aa67-1328e50cdb2c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.339807 4933 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fgvgb_0feefe80-99a0-4d78-9753-c823a10fc0f8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.375530 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4m7pz_8b53d0b1-3511-4e0c-9d87-b7ceab39da16/ssh-known-hosts-edpm-deployment/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.686522 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64f6f8f6c-x5wnf_97b9d44e-b2f2-4da0-82e0-28c657d8df41/proxy-server/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.768650 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64f6f8f6c-x5wnf_97b9d44e-b2f2-4da0-82e0-28c657d8df41/proxy-httpd/0.log" Dec 02 17:34:57 crc kubenswrapper[4933]: I1202 17:34:57.886418 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vwmtj_591c4cbc-2470-449b-9046-76b4c6543cb9/swift-ring-rebalance/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.078357 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-reaper/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.080244 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-auditor/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.243962 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-server/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.253402 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-replicator/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.343527 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-auditor/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.416093 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-replicator/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.508457 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-server/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.511227 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-updater/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.594898 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-auditor/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.636440 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-expirer/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.691784 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-server/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.738165 4933 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-replicator/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.850904 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-updater/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.880743 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/swift-recon-cron/0.log" Dec 02 17:34:58 crc kubenswrapper[4933]: I1202 17:34:58.887591 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/rsync/0.log" Dec 02 17:34:59 crc kubenswrapper[4933]: I1202 17:34:59.076360 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp_efd2ea83-1192-411e-9c7f-bff8761883fb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:59 crc kubenswrapper[4933]: I1202 17:34:59.164841 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx_06b52c33-11b2-4a83-a9b8-2de5845e6e89/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:34:59 crc kubenswrapper[4933]: I1202 17:34:59.365713 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_595befc5-8c2b-4b8e-8991-e12460cd0ef2/test-operator-logs-container/0.log" Dec 02 17:34:59 crc kubenswrapper[4933]: I1202 17:34:59.550612 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9htq4_7aa88472-106f-484d-ae92-94c064b2a908/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:35:00 crc kubenswrapper[4933]: I1202 17:35:00.232424 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bd0fb308-1858-4dea-bf49-38e577824bd0/tempest-tests-tempest-tests-runner/0.log" Dec 02 17:35:06 crc kubenswrapper[4933]: I1202 17:35:06.831055 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_32a2d025-2245-46a6-82d1-228e920490a3/memcached/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.078726 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/util/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.259721 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/pull/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.278599 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/pull/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.515243 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/util/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.679213 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/util/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.730521 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/pull/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.764640 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/extract/0.log" Dec 02 17:35:29 crc kubenswrapper[4933]: I1202 17:35:29.931959 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ltqgf_d7ebb5f9-1140-47dc-a2e2-1952a295d218/kube-rbac-proxy/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.016042 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zhsjb_4a6c2fb7-89ad-402e-8e31-b56e50d1386c/kube-rbac-proxy/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.074054 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ltqgf_d7ebb5f9-1140-47dc-a2e2-1952a295d218/manager/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.215704 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zhsjb_4a6c2fb7-89ad-402e-8e31-b56e50d1386c/manager/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.279810 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-vgblp_623d7dfe-ddc2-4557-95a3-02b8fb56ee35/kube-rbac-proxy/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.318455 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-vgblp_623d7dfe-ddc2-4557-95a3-02b8fb56ee35/manager/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.454298 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-9kzqg_443e1418-ad3d-4f7a-b7b0-682beceb2977/kube-rbac-proxy/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.546373 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-9kzqg_443e1418-ad3d-4f7a-b7b0-682beceb2977/manager/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.668120 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rzj5x_fba1c43c-fdfb-4ea0-939b-e38adcc79720/kube-rbac-proxy/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.779373 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-t7rsk_e7437781-26c4-4a57-8afc-6bbc1ef7f7dd/kube-rbac-proxy/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.793350 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rzj5x_fba1c43c-fdfb-4ea0-939b-e38adcc79720/manager/0.log" Dec 02 17:35:30 crc kubenswrapper[4933]: I1202 17:35:30.943285 
4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-t7rsk_e7437781-26c4-4a57-8afc-6bbc1ef7f7dd/manager/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.048300 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ggssz_2e0aae7d-292b-4117-8b13-6021d1b5174a/kube-rbac-proxy/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.187162 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ggssz_2e0aae7d-292b-4117-8b13-6021d1b5174a/manager/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.282590 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ln9b5_e80caebf-2148-4666-98bb-963fce1bc84e/kube-rbac-proxy/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.305991 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ln9b5_e80caebf-2148-4666-98bb-963fce1bc84e/manager/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.489977 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mslvn_9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa/kube-rbac-proxy/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.683515 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mslvn_9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa/manager/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.750433 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-rmd89_a5d0bd48-d38b-44ae-b486-1ff751a0791a/kube-rbac-proxy/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.769958 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-rmd89_a5d0bd48-d38b-44ae-b486-1ff751a0791a/manager/0.log" Dec 02 17:35:31 crc kubenswrapper[4933]: I1202 17:35:31.927546 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7zb88_22a42f16-b74f-4323-aee5-2d713c1232ea/kube-rbac-proxy/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.013893 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7zb88_22a42f16-b74f-4323-aee5-2d713c1232ea/manager/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.118982 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-n59qc_b8e9c61c-a72f-41e1-8f62-99d951ce4950/kube-rbac-proxy/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.177347 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-n59qc_b8e9c61c-a72f-41e1-8f62-99d951ce4950/manager/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.275440 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-87566_a35dd4ba-4d05-4af0-b0b2-2285e9e35889/kube-rbac-proxy/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: 
I1202 17:35:32.430699 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-87566_a35dd4ba-4d05-4af0-b0b2-2285e9e35889/manager/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.484403 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-t8bvj_7d831995-fd00-455a-822e-82eb0cca6a33/kube-rbac-proxy/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.538261 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-t8bvj_7d831995-fd00-455a-822e-82eb0cca6a33/manager/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.587846 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8_a9221a82-9f86-4d33-a63b-71bd4c532830/kube-rbac-proxy/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.674922 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8_a9221a82-9f86-4d33-a63b-71bd4c532830/manager/0.log" Dec 02 17:35:32 crc kubenswrapper[4933]: I1202 17:35:32.991747 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vv5bw_df3012be-3abc-4502-a517-2d685340047f/registry-server/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.067943 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69c497bc86-fnlmp_a85c4e8a-605a-46c0-9b73-a9fa99a314a1/operator/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.198701 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mgdvd_9e637867-b8e7-48b3-987d-53172ec80734/kube-rbac-proxy/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.327594 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-dbnk8_e19f7d2e-55da-4ba5-9a68-0d49c06eecf3/kube-rbac-proxy/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.347213 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mgdvd_9e637867-b8e7-48b3-987d-53172ec80734/manager/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.441236 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-dbnk8_e19f7d2e-55da-4ba5-9a68-0d49c06eecf3/manager/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.613734 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-f8sqn_fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67/operator/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.644512 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qgclr_d5d121c4-7a04-478d-b210-36b258949699/kube-rbac-proxy/0.log" Dec 02 17:35:33 crc kubenswrapper[4933]: I1202 17:35:33.858597 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qgclr_d5d121c4-7a04-478d-b210-36b258949699/manager/0.log" Dec 02 17:35:33 crc 
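
Every path= in these log.go:25 entries follows the kubelet's standard pod-log layout, /var/log/pods/<namespace>_<pod-name>_<pod-uid>/<container>/<restart>.log. A sketch for decoding those paths when grouping entries by namespace or container; the helper name is illustrative, not part of kubelet:

```python
import re

# Kubelet pod-log layout:
#   /var/log/pods/<namespace>_<pod-name>_<pod-uid>/<container>/<restart>.log
# (namespaces and pod names cannot contain "_", so it is a safe separator)
POD_LOG = re.compile(
    r"/var/log/pods/"
    r"(?P<namespace>[^_/]+)_(?P<pod>[^_/]+)_(?P<uid>[^/]+)/"
    r"(?P<container>[^/]+)/(?P<restart>\d+)\.log"
)

def parse_pod_log_path(path):
    """Return the path components as a dict, or None if the layout differs."""
    m = POD_LOG.search(path)
    return m.groupdict() if m else None

print(parse_pod_log_path(
    "/var/log/pods/openstack-operators_glance-operator-controller-manager-"
    "77987cd8cd-9kzqg_443e1418-ad3d-4f7a-b7b0-682beceb2977/manager/0.log"
))
```
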
kubenswrapper[4933]: I1202 17:35:33.866616 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65697495f7-vqpwl_94050b59-4392-4a9d-9ce8-b5e2c61e0d46/kube-rbac-proxy/0.log" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.102374 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-w2ddp_7f82c652-2e90-4fd6-bd23-381c2f529a27/manager/0.log" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.165977 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5wlgz"] Dec 02 17:35:34 crc kubenswrapper[4933]: E1202 17:35:34.166873 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62477a6c-7b87-4ed1-8645-5df52b452051" containerName="container-00" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.166893 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="62477a6c-7b87-4ed1-8645-5df52b452051" containerName="container-00" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.168714 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="62477a6c-7b87-4ed1-8645-5df52b452051" containerName="container-00" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.170528 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.179915 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wlgz"] Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.199795 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-w2ddp_7f82c652-2e90-4fd6-bd23-381c2f529a27/kube-rbac-proxy/0.log" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.244521 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-utilities\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.244742 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrd8d\" (UniqueName: \"kubernetes.io/projected/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-kube-api-access-qrd8d\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.245041 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-catalog-content\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.346521 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-catalog-content\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc 
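
The certified-operators-5wlgz entries here show the usual cold-start order: "SyncLoop ADD" from the API, RemoveStaleState housekeeping for an older pod UID, "No sandbox for pod can be found", then VerifyControllerAttachedVolume and MountVolume.SetUp for each of the pod's three volumes. A sketch that pulls every entry for one pod out of a journal so that order is easy to follow; it assumes one entry per line (see the re-split sketch above) and a file named kubelet.log:

```python
import sys

# Print every kubelet journal entry mentioning one pod, preserving order.
# Assumes a one-entry-per-line journal saved locally (filename is illustrative).
def pod_events(journal_path, needle):
    with open(journal_path, encoding="utf-8", errors="replace") as f:
        for n, line in enumerate(f, 1):
            if needle in line:
                yield n, line.rstrip()

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "kubelet.log"
    for n, entry in pod_events(path, "certified-operators-5wlgz"):
        print(f"{n:6d}  {entry[:160]}")
```
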
kubenswrapper[4933]: I1202 17:35:34.346598 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-utilities\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.346618 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrd8d\" (UniqueName: \"kubernetes.io/projected/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-kube-api-access-qrd8d\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.347196 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-catalog-content\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.347275 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-utilities\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.358052 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65697495f7-vqpwl_94050b59-4392-4a9d-9ce8-b5e2c61e0d46/manager/0.log" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.392918 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrd8d\" (UniqueName: \"kubernetes.io/projected/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-kube-api-access-qrd8d\") pod \"certified-operators-5wlgz\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.396699 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-596767c485-f2zz4_a23e8ea6-05d4-49b6-a661-30b1236cb653/manager/0.log" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.422381 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-c2pbl_7ab353e5-677f-499c-8909-47813767a26c/kube-rbac-proxy/0.log" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.507810 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:34 crc kubenswrapper[4933]: I1202 17:35:34.606421 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-c2pbl_7ab353e5-677f-499c-8909-47813767a26c/manager/0.log" Dec 02 17:35:35 crc kubenswrapper[4933]: W1202 17:35:35.485715 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43bbe484_a14a_4f3d_96f1_ae27ebc50bb8.slice/crio-05b43d9d6fb5bccb8438d7db1fa04d2929d68a915614885faddaa88e60300a26 WatchSource:0}: Error finding container 05b43d9d6fb5bccb8438d7db1fa04d2929d68a915614885faddaa88e60300a26: Status 404 returned error can't find the container with id 05b43d9d6fb5bccb8438d7db1fa04d2929d68a915614885faddaa88e60300a26 Dec 02 17:35:35 crc kubenswrapper[4933]: I1202 17:35:35.498505 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wlgz"] Dec 02 17:35:35 crc kubenswrapper[4933]: I1202 17:35:35.909806 4933 generic.go:334] "Generic (PLEG): container finished" podID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerID="eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49" exitCode=0 Dec 02 17:35:35 crc kubenswrapper[4933]: I1202 17:35:35.909863 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wlgz" event={"ID":"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8","Type":"ContainerDied","Data":"eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49"} Dec 02 17:35:35 crc kubenswrapper[4933]: I1202 17:35:35.910158 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wlgz" event={"ID":"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8","Type":"ContainerStarted","Data":"05b43d9d6fb5bccb8438d7db1fa04d2929d68a915614885faddaa88e60300a26"} Dec 02 17:35:37 crc kubenswrapper[4933]: I1202 17:35:37.951100 4933 generic.go:334] "Generic (PLEG): container finished" podID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerID="e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6" exitCode=0 Dec 02 17:35:37 crc kubenswrapper[4933]: I1202 17:35:37.951584 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wlgz" event={"ID":"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8","Type":"ContainerDied","Data":"e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6"} Dec 02 17:35:39 crc kubenswrapper[4933]: I1202 17:35:39.990620 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wlgz" event={"ID":"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8","Type":"ContainerStarted","Data":"eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7"} Dec 02 17:35:40 crc kubenswrapper[4933]: I1202 17:35:40.016690 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5wlgz" podStartSLOduration=2.552369106 podStartE2EDuration="6.016668896s" podCreationTimestamp="2025-12-02 17:35:34 +0000 UTC" firstStartedPulling="2025-12-02 17:35:35.915543775 +0000 UTC m=+6199.166770478" lastFinishedPulling="2025-12-02 17:35:39.379843575 +0000 UTC m=+6202.631070268" observedRunningTime="2025-12-02 17:35:40.008002668 +0000 UTC m=+6203.259229391" watchObservedRunningTime="2025-12-02 17:35:40.016668896 +0000 UTC m=+6203.267895599" Dec 02 17:35:44 crc kubenswrapper[4933]: I1202 17:35:44.508479 4933 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:44 crc kubenswrapper[4933]: I1202 17:35:44.511495 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:44 crc kubenswrapper[4933]: I1202 17:35:44.572609 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:45 crc kubenswrapper[4933]: I1202 17:35:45.114534 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:45 crc kubenswrapper[4933]: I1202 17:35:45.179105 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wlgz"] Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.074265 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5wlgz" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="registry-server" containerID="cri-o://eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7" gracePeriod=2 Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.642456 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.697575 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrd8d\" (UniqueName: \"kubernetes.io/projected/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-kube-api-access-qrd8d\") pod \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.698136 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-catalog-content\") pod \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.698189 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-utilities\") pod \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\" (UID: \"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8\") " Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.699188 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-utilities" (OuterVolumeSpecName: "utilities") pod "43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" (UID: "43bbe484-a14a-4f3d-96f1-ae27ebc50bb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.706018 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-kube-api-access-qrd8d" (OuterVolumeSpecName: "kube-api-access-qrd8d") pod "43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" (UID: "43bbe484-a14a-4f3d-96f1-ae27ebc50bb8"). InnerVolumeSpecName "kube-api-access-qrd8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.763787 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" (UID: "43bbe484-a14a-4f3d-96f1-ae27ebc50bb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.816918 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.816953 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:35:47 crc kubenswrapper[4933]: I1202 17:35:47.816962 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrd8d\" (UniqueName: \"kubernetes.io/projected/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8-kube-api-access-qrd8d\") on node \"crc\" DevicePath \"\"" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.080284 4933 generic.go:334] "Generic (PLEG): container finished" podID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerID="eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7" exitCode=0 Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.080342 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wlgz" event={"ID":"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8","Type":"ContainerDied","Data":"eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7"} Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.080379 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wlgz" event={"ID":"43bbe484-a14a-4f3d-96f1-ae27ebc50bb8","Type":"ContainerDied","Data":"05b43d9d6fb5bccb8438d7db1fa04d2929d68a915614885faddaa88e60300a26"} Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.080406 4933 scope.go:117] "RemoveContainer" containerID="eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.080597 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wlgz" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.114908 4933 scope.go:117] "RemoveContainer" containerID="e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.145106 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wlgz"] Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.149211 4933 scope.go:117] "RemoveContainer" containerID="eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.155050 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5wlgz"] Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.203892 4933 scope.go:117] "RemoveContainer" containerID="eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7" Dec 02 17:35:48 crc kubenswrapper[4933]: E1202 17:35:48.206164 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7\": container with ID starting with eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7 not found: ID does not exist" containerID="eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.206213 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7"} err="failed to get container status \"eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7\": rpc error: code = NotFound desc = could not find container \"eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7\": container with ID starting with eda2e319f4559fa2d70d86449dfd0af7c6396f3814f7cd34122d245ed65dedb7 not found: ID does not exist" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.206252 4933 scope.go:117] "RemoveContainer" containerID="e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6" Dec 02 17:35:48 crc kubenswrapper[4933]: E1202 17:35:48.206563 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6\": container with ID starting with e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6 not found: ID does not exist" containerID="e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.206588 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6"} err="failed to get container status \"e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6\": rpc error: code = NotFound desc = could not find container \"e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6\": container with ID starting with e8990c93e0487161fa69dcf432b9fc27d4622b5365e1b36faeda3aef80dd68c6 not found: ID does not exist" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.206608 4933 scope.go:117] "RemoveContainer" containerID="eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49" Dec 02 17:35:48 crc kubenswrapper[4933]: E1202 17:35:48.206970 4933 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49\": container with ID starting with eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49 not found: ID does not exist" containerID="eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49" Dec 02 17:35:48 crc kubenswrapper[4933]: I1202 17:35:48.207009 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49"} err="failed to get container status \"eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49\": rpc error: code = NotFound desc = could not find container \"eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49\": container with ID starting with eb3ceaab6318f64d139425a53e349e83fa8a00d817c390799e18c60437097d49 not found: ID does not exist" Dec 02 17:35:49 crc kubenswrapper[4933]: I1202 17:35:49.075621 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" path="/var/lib/kubelet/pods/43bbe484-a14a-4f3d-96f1-ae27ebc50bb8/volumes" Dec 02 17:35:56 crc kubenswrapper[4933]: I1202 17:35:56.711038 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-cx7jx_061606b3-9a47-4cff-ad31-04e9a5a05528/control-plane-machine-set-operator/0.log" Dec 02 17:35:56 crc kubenswrapper[4933]: I1202 17:35:56.899753 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-brhm4_2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4/kube-rbac-proxy/0.log" Dec 02 17:35:56 crc kubenswrapper[4933]: I1202 17:35:56.929455 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-brhm4_2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4/machine-api-operator/0.log" Dec 02 17:36:10 crc kubenswrapper[4933]: I1202 17:36:10.506022 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-dhbnp_e81749dd-af4f-46b4-954a-4581cf720cd5/cert-manager-controller/0.log" Dec 02 17:36:10 crc kubenswrapper[4933]: I1202 17:36:10.688697 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q4f6z_b2d605a1-7e07-47c6-af5f-e2f2a41df05d/cert-manager-cainjector/0.log" Dec 02 17:36:10 crc kubenswrapper[4933]: I1202 17:36:10.799171 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lpsrb_41523154-4286-47e3-9d29-3adb94b50073/cert-manager-webhook/0.log" Dec 02 17:36:17 crc kubenswrapper[4933]: I1202 17:36:17.760515 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:36:17 crc kubenswrapper[4933]: I1202 17:36:17.761222 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:36:24 crc kubenswrapper[4933]: I1202 17:36:24.696837 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-gl4gm_a4ec34d6-4de1-4ecb-a60a-46d969b32ff4/nmstate-console-plugin/0.log" Dec 02 17:36:24 crc kubenswrapper[4933]: I1202 17:36:24.844492 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zjk85_6dec4a20-a5e7-4e76-80d5-8dfbb82021b3/nmstate-handler/0.log" Dec 02 17:36:24 crc kubenswrapper[4933]: I1202 17:36:24.900999 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qr4ph_09f6889b-f9be-48f6-a955-c19824c76cdd/kube-rbac-proxy/0.log" Dec 02 17:36:24 crc kubenswrapper[4933]: I1202 17:36:24.982136 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qr4ph_09f6889b-f9be-48f6-a955-c19824c76cdd/nmstate-metrics/0.log" Dec 02 17:36:25 crc kubenswrapper[4933]: I1202 17:36:25.105552 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-mqmjq_37530e12-c39d-43b8-887f-64e9584c1d48/nmstate-operator/0.log" Dec 02 17:36:25 crc kubenswrapper[4933]: I1202 17:36:25.227342 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-85srr_f007b4c7-10dc-452b-b11c-1506dc25da9a/nmstate-webhook/0.log" Dec 02 17:36:38 crc kubenswrapper[4933]: I1202 17:36:38.759712 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/manager/0.log" Dec 02 17:36:38 crc kubenswrapper[4933]: I1202 17:36:38.825228 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/kube-rbac-proxy/0.log" Dec 02 17:36:47 crc kubenswrapper[4933]: I1202 17:36:47.169114 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:36:47 crc kubenswrapper[4933]: I1202 17:36:47.169750 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:36:54 crc kubenswrapper[4933]: I1202 17:36:54.485142 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-sg6kn_5390a7b1-0b5d-485c-8fc3-44fca44d4286/cluster-logging-operator/0.log" Dec 02 17:36:54 crc kubenswrapper[4933]: I1202 17:36:54.571144 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-9hj2j_1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d/collector/0.log" Dec 02 17:36:54 crc kubenswrapper[4933]: I1202 17:36:54.698237 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_39e91993-69e7-44dc-bb73-5372571ce233/loki-compactor/0.log" Dec 02 17:36:54 crc kubenswrapper[4933]: I1202 17:36:54.763100 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-bps9l_92d8d29e-d630-4e5b-9697-5f388253535a/loki-distributor/0.log" Dec 02 17:36:54 
crc kubenswrapper[4933]: I1202 17:36:54.909947 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-28287_349b144f-569c-411e-9651-db4659c64925/opa/0.log" Dec 02 17:36:54 crc kubenswrapper[4933]: I1202 17:36:54.947813 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-28287_349b144f-569c-411e-9651-db4659c64925/gateway/0.log" Dec 02 17:36:55 crc kubenswrapper[4933]: I1202 17:36:55.099614 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-wj2pd_c016a791-ff2e-4466-95e5-80333c909784/gateway/0.log" Dec 02 17:36:55 crc kubenswrapper[4933]: I1202 17:36:55.152939 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-wj2pd_c016a791-ff2e-4466-95e5-80333c909784/opa/0.log" Dec 02 17:36:55 crc kubenswrapper[4933]: I1202 17:36:55.237577 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_50300b3a-1c70-41d4-a37c-5b121b3c0efb/loki-index-gateway/0.log" Dec 02 17:36:55 crc kubenswrapper[4933]: I1202 17:36:55.398860 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_cb0e0e26-ef5a-48bd-aa32-af5919aaf24a/loki-ingester/0.log" Dec 02 17:36:55 crc kubenswrapper[4933]: I1202 17:36:55.469292 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-d4wmg_b969ec3f-f1c9-4752-b473-1a14450526c1/loki-querier/0.log" Dec 02 17:36:55 crc kubenswrapper[4933]: I1202 17:36:55.628363 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-drtkk_4020f625-85e6-4986-84aa-1f73717025cb/loki-query-frontend/0.log" Dec 02 17:37:10 crc kubenswrapper[4933]: I1202 17:37:10.467441 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dkmmd_b18bcd0e-ac22-4425-940c-29311785588f/kube-rbac-proxy/0.log" Dec 02 17:37:10 crc kubenswrapper[4933]: I1202 17:37:10.654875 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dkmmd_b18bcd0e-ac22-4425-940c-29311785588f/controller/0.log" Dec 02 17:37:10 crc kubenswrapper[4933]: I1202 17:37:10.728353 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.075557 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.092904 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.118559 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.187972 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.360117 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.371669 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.399167 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.408271 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.593839 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.606300 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.611127 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.645008 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/controller/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.795665 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/frr-metrics/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.852284 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/kube-rbac-proxy/0.log" Dec 02 17:37:11 crc kubenswrapper[4933]: I1202 17:37:11.867248 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/kube-rbac-proxy-frr/0.log" Dec 02 17:37:12 crc kubenswrapper[4933]: I1202 17:37:12.073676 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/reloader/0.log" Dec 02 17:37:12 crc kubenswrapper[4933]: I1202 17:37:12.098948 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-r4spw_b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6/frr-k8s-webhook-server/0.log" Dec 02 17:37:12 crc kubenswrapper[4933]: I1202 17:37:12.393879 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-777b6c8cc-hkdpl_a987e1fe-f524-4d29-98c4-cc0238d7b79f/manager/0.log" Dec 02 17:37:12 crc kubenswrapper[4933]: I1202 17:37:12.590077 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-ff64ff47b-dqgr2_3493ec7a-7065-4895-944d-2aefb1419718/webhook-server/0.log" Dec 02 17:37:12 crc kubenswrapper[4933]: I1202 17:37:12.658042 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7428k_cf92f734-644b-4a6c-b716-bf6269ae99b5/kube-rbac-proxy/0.log" Dec 02 17:37:13 crc kubenswrapper[4933]: I1202 17:37:13.435350 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-7428k_cf92f734-644b-4a6c-b716-bf6269ae99b5/speaker/0.log" Dec 02 17:37:13 crc kubenswrapper[4933]: I1202 17:37:13.750058 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/frr/0.log" Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.169967 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.170796 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.170921 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.172898 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a83509fdc2a402118ff91daac7799c2a25cc85aee38ea32fab3a07cc2be654e1"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.173086 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://a83509fdc2a402118ff91daac7799c2a25cc85aee38ea32fab3a07cc2be654e1" gracePeriod=600 Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.426928 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="a83509fdc2a402118ff91daac7799c2a25cc85aee38ea32fab3a07cc2be654e1" exitCode=0 Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.426982 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"a83509fdc2a402118ff91daac7799c2a25cc85aee38ea32fab3a07cc2be654e1"} Dec 02 17:37:17 crc kubenswrapper[4933]: I1202 17:37:17.427021 4933 scope.go:117] "RemoveContainer" containerID="8e9bedbcd8d4eb23a5878e0ca4c08cb5a8edf59a17a5b95e1bb072e6b3f98d17" Dec 02 17:37:18 crc kubenswrapper[4933]: I1202 17:37:18.443171 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8"} Dec 02 17:37:25 crc kubenswrapper[4933]: I1202 17:37:25.777383 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/util/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.030994 4933 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/util/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.055226 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/pull/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.080279 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/pull/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.242540 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/util/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.242803 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/pull/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.246775 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/extract/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.412660 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/util/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.629925 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/util/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.640364 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/pull/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.640488 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/pull/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.809140 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/util/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.824084 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/pull/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.873329 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/extract/0.log" Dec 02 17:37:26 crc kubenswrapper[4933]: I1202 17:37:26.973747 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/util/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.197901 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/util/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.230670 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/pull/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.230880 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/pull/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.388746 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/util/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.391127 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/extract/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.402222 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/pull/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.588649 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/util/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.712369 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/util/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.754922 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/pull/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.771202 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/pull/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.926742 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/extract/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.926977 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/pull/0.log" Dec 02 17:37:27 crc kubenswrapper[4933]: I1202 17:37:27.995911 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/util/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.182044 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/util/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.521723 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/util/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.597226 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/pull/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.599999 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/pull/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.770042 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/extract/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.770845 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/util/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.870440 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/pull/0.log" Dec 02 17:37:28 crc kubenswrapper[4933]: I1202 17:37:28.961417 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-utilities/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.137511 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-utilities/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.168804 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-content/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.175149 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-content/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.371494 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-content/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.371494 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-utilities/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.645303 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-utilities/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.841692 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-content/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.853460 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-content/0.log" Dec 02 17:37:29 crc kubenswrapper[4933]: I1202 17:37:29.903326 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-utilities/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.234030 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-utilities/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.269979 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-content/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.286143 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/registry-server/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.485725 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/3.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.521771 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/2.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.526808 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-utilities/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.807108 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-content/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.887242 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-content/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.887404 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-utilities/0.log" Dec 02 17:37:30 crc kubenswrapper[4933]: I1202 17:37:30.933659 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/registry-server/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.018157 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-utilities/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.072225 4933 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-content/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.167640 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-utilities/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.286711 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/registry-server/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.351235 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-utilities/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.398782 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-content/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.398779 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-content/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.620663 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-content/0.log" Dec 02 17:37:31 crc kubenswrapper[4933]: I1202 17:37:31.638182 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-utilities/0.log" Dec 02 17:37:32 crc kubenswrapper[4933]: I1202 17:37:32.349465 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/registry-server/0.log" Dec 02 17:37:44 crc kubenswrapper[4933]: I1202 17:37:44.939420 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-pcq8l_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8/prometheus-operator/0.log" Dec 02 17:37:45 crc kubenswrapper[4933]: I1202 17:37:45.080196 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_a8b5527a-5f6c-461a-8397-c911f538eb3a/prometheus-operator-admission-webhook/0.log" Dec 02 17:37:45 crc kubenswrapper[4933]: I1202 17:37:45.183850 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_15f56600-a066-40fd-8433-d0552173dc57/prometheus-operator-admission-webhook/0.log" Dec 02 17:37:45 crc kubenswrapper[4933]: I1202 17:37:45.310931 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-dpvq4_9f22c4f9-d4f0-4ff8-9322-f03662c116a8/operator/0.log" Dec 02 17:37:45 crc kubenswrapper[4933]: I1202 17:37:45.399671 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-zdcrf_d691d907-a29d-40ad-ad96-009e8a7d56e8/observability-ui-dashboards/0.log" Dec 02 17:37:45 crc kubenswrapper[4933]: I1202 17:37:45.530890 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-s9dt9_1ec8da33-52a7-4abb-a205-7c14a8186f5e/perses-operator/0.log" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.011939 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x7gjw"] Dec 02 17:37:54 crc kubenswrapper[4933]: E1202 17:37:54.013562 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="extract-content" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.013577 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="extract-content" Dec 02 17:37:54 crc kubenswrapper[4933]: E1202 17:37:54.013597 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="registry-server" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.013603 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="registry-server" Dec 02 17:37:54 crc kubenswrapper[4933]: E1202 17:37:54.013626 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="extract-utilities" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.013634 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="extract-utilities" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.013937 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bbe484-a14a-4f3d-96f1-ae27ebc50bb8" containerName="registry-server" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.015766 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.027297 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7gjw"] Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.030865 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-utilities\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.030934 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-catalog-content\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.031023 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72h6\" (UniqueName: \"kubernetes.io/projected/2eeaf410-601f-4e03-960e-1694782292e1-kube-api-access-l72h6\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.133677 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-utilities\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.133754 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-catalog-content\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.133843 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l72h6\" (UniqueName: \"kubernetes.io/projected/2eeaf410-601f-4e03-960e-1694782292e1-kube-api-access-l72h6\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.135120 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-utilities\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.135170 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-catalog-content\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.167616 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l72h6\" (UniqueName: \"kubernetes.io/projected/2eeaf410-601f-4e03-960e-1694782292e1-kube-api-access-l72h6\") pod \"redhat-operators-x7gjw\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.341766 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:37:54 crc kubenswrapper[4933]: I1202 17:37:54.852721 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7gjw"] Dec 02 17:37:55 crc kubenswrapper[4933]: I1202 17:37:55.854908 4933 generic.go:334] "Generic (PLEG): container finished" podID="2eeaf410-601f-4e03-960e-1694782292e1" containerID="a9602aba49982aaa9efefc44a35bce8eaea4fcbaaacd1618106baa2fc1fc1d6e" exitCode=0 Dec 02 17:37:55 crc kubenswrapper[4933]: I1202 17:37:55.855005 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerDied","Data":"a9602aba49982aaa9efefc44a35bce8eaea4fcbaaacd1618106baa2fc1fc1d6e"} Dec 02 17:37:55 crc kubenswrapper[4933]: I1202 17:37:55.855468 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerStarted","Data":"4a8ca79124dd332bfd08314e20935d16a5212e2cc43bcefe717b2139503ac089"} Dec 02 17:37:55 crc kubenswrapper[4933]: I1202 17:37:55.858718 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:37:56 crc kubenswrapper[4933]: I1202 17:37:56.872272 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerStarted","Data":"658b6ed50b85e8d6d70441165dbe7e7bb1405240d9cb0eb8439db75cc7682ad8"} Dec 02 17:37:59 crc kubenswrapper[4933]: I1202 17:37:59.901653 4933 generic.go:334] "Generic (PLEG): container finished" podID="2eeaf410-601f-4e03-960e-1694782292e1" containerID="658b6ed50b85e8d6d70441165dbe7e7bb1405240d9cb0eb8439db75cc7682ad8" exitCode=0 Dec 02 17:37:59 crc kubenswrapper[4933]: I1202 17:37:59.901723 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerDied","Data":"658b6ed50b85e8d6d70441165dbe7e7bb1405240d9cb0eb8439db75cc7682ad8"} Dec 02 17:38:00 crc kubenswrapper[4933]: I1202 17:38:00.170747 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/kube-rbac-proxy/0.log" Dec 02 17:38:00 crc kubenswrapper[4933]: I1202 17:38:00.191060 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/manager/0.log" Dec 02 17:38:00 crc kubenswrapper[4933]: I1202 17:38:00.915746 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerStarted","Data":"976c302ed2e4bb6a8c58c33b8f8c73de1d3d11c99d90398c9531b5b3bb3c38c8"} Dec 02 17:38:00 crc kubenswrapper[4933]: I1202 17:38:00.937999 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-x7gjw" podStartSLOduration=3.4952467240000002 podStartE2EDuration="7.937978006s" podCreationTimestamp="2025-12-02 17:37:53 +0000 UTC" firstStartedPulling="2025-12-02 17:37:55.857903349 +0000 UTC m=+6339.109130052" lastFinishedPulling="2025-12-02 17:38:00.300634631 +0000 UTC m=+6343.551861334" observedRunningTime="2025-12-02 17:38:00.929605985 +0000 UTC m=+6344.180832688" watchObservedRunningTime="2025-12-02 17:38:00.937978006 +0000 UTC m=+6344.189204719" Dec 02 17:38:04 crc kubenswrapper[4933]: I1202 17:38:04.342940 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:38:04 crc kubenswrapper[4933]: I1202 17:38:04.343269 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:38:05 crc kubenswrapper[4933]: I1202 17:38:05.403725 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x7gjw" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="registry-server" probeResult="failure" output=< Dec 02 17:38:05 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:38:05 crc kubenswrapper[4933]: > Dec 02 17:38:10 crc kubenswrapper[4933]: E1202 17:38:10.278592 4933 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:40912->38.102.83.213:44657: write tcp 38.102.83.213:40912->38.102.83.213:44657: write: connection reset by peer Dec 02 17:38:15 crc kubenswrapper[4933]: I1202 17:38:15.417450 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x7gjw" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="registry-server" probeResult="failure" output=< Dec 02 17:38:15 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:38:15 crc kubenswrapper[4933]: > Dec 02 17:38:24 crc kubenswrapper[4933]: I1202 17:38:24.404473 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:38:24 crc kubenswrapper[4933]: I1202 17:38:24.463670 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:38:25 crc kubenswrapper[4933]: I1202 17:38:25.222162 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7gjw"] Dec 02 17:38:26 crc kubenswrapper[4933]: I1202 17:38:26.230817 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x7gjw" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="registry-server" containerID="cri-o://976c302ed2e4bb6a8c58c33b8f8c73de1d3d11c99d90398c9531b5b3bb3c38c8" gracePeriod=2 Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.297755 4933 generic.go:334] "Generic (PLEG): container finished" podID="2eeaf410-601f-4e03-960e-1694782292e1" containerID="976c302ed2e4bb6a8c58c33b8f8c73de1d3d11c99d90398c9531b5b3bb3c38c8" exitCode=0 Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.298058 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerDied","Data":"976c302ed2e4bb6a8c58c33b8f8c73de1d3d11c99d90398c9531b5b3bb3c38c8"} Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.389034 4933 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.590308 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-catalog-content\") pod \"2eeaf410-601f-4e03-960e-1694782292e1\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.590433 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l72h6\" (UniqueName: \"kubernetes.io/projected/2eeaf410-601f-4e03-960e-1694782292e1-kube-api-access-l72h6\") pod \"2eeaf410-601f-4e03-960e-1694782292e1\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.590635 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-utilities\") pod \"2eeaf410-601f-4e03-960e-1694782292e1\" (UID: \"2eeaf410-601f-4e03-960e-1694782292e1\") " Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.591698 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-utilities" (OuterVolumeSpecName: "utilities") pod "2eeaf410-601f-4e03-960e-1694782292e1" (UID: "2eeaf410-601f-4e03-960e-1694782292e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.599881 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eeaf410-601f-4e03-960e-1694782292e1-kube-api-access-l72h6" (OuterVolumeSpecName: "kube-api-access-l72h6") pod "2eeaf410-601f-4e03-960e-1694782292e1" (UID: "2eeaf410-601f-4e03-960e-1694782292e1"). InnerVolumeSpecName "kube-api-access-l72h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.695200 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.695538 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l72h6\" (UniqueName: \"kubernetes.io/projected/2eeaf410-601f-4e03-960e-1694782292e1-kube-api-access-l72h6\") on node \"crc\" DevicePath \"\"" Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.707986 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eeaf410-601f-4e03-960e-1694782292e1" (UID: "2eeaf410-601f-4e03-960e-1694782292e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:38:27 crc kubenswrapper[4933]: I1202 17:38:27.797256 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeaf410-601f-4e03-960e-1694782292e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.312015 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7gjw" event={"ID":"2eeaf410-601f-4e03-960e-1694782292e1","Type":"ContainerDied","Data":"4a8ca79124dd332bfd08314e20935d16a5212e2cc43bcefe717b2139503ac089"} Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.312059 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7gjw" Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.312100 4933 scope.go:117] "RemoveContainer" containerID="976c302ed2e4bb6a8c58c33b8f8c73de1d3d11c99d90398c9531b5b3bb3c38c8" Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.368339 4933 scope.go:117] "RemoveContainer" containerID="658b6ed50b85e8d6d70441165dbe7e7bb1405240d9cb0eb8439db75cc7682ad8" Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.373792 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7gjw"] Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.386721 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x7gjw"] Dec 02 17:38:28 crc kubenswrapper[4933]: I1202 17:38:28.405112 4933 scope.go:117] "RemoveContainer" containerID="a9602aba49982aaa9efefc44a35bce8eaea4fcbaaacd1618106baa2fc1fc1d6e" Dec 02 17:38:29 crc kubenswrapper[4933]: I1202 17:38:29.091570 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eeaf410-601f-4e03-960e-1694782292e1" path="/var/lib/kubelet/pods/2eeaf410-601f-4e03-960e-1694782292e1/volumes" Dec 02 17:39:17 crc kubenswrapper[4933]: I1202 17:39:17.169399 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:39:17 crc kubenswrapper[4933]: I1202 17:39:17.170066 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:39:47 crc kubenswrapper[4933]: I1202 17:39:47.170136 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:39:47 crc kubenswrapper[4933]: I1202 17:39:47.170799 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:39:48 crc kubenswrapper[4933]: E1202 17:39:48.273767 4933 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924c1228_98a1_4dea_b42f_d1f680719607.slice/crio-6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d.scope\": RecentStats: unable to find data in memory cache]" Dec 02 17:39:48 crc kubenswrapper[4933]: E1202 17:39:48.274080 4933 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924c1228_98a1_4dea_b42f_d1f680719607.slice/crio-conmon-6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924c1228_98a1_4dea_b42f_d1f680719607.slice/crio-6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d.scope\": RecentStats: unable to find data in memory cache]" Dec 02 17:39:48 crc kubenswrapper[4933]: I1202 17:39:48.731858 4933 generic.go:334] "Generic (PLEG): container finished" podID="924c1228-98a1-4dea-b42f-d1f680719607" containerID="6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d" exitCode=0 Dec 02 17:39:48 crc kubenswrapper[4933]: I1202 17:39:48.731962 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" event={"ID":"924c1228-98a1-4dea-b42f-d1f680719607","Type":"ContainerDied","Data":"6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d"} Dec 02 17:39:48 crc kubenswrapper[4933]: I1202 17:39:48.733036 4933 scope.go:117] "RemoveContainer" containerID="6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d" Dec 02 17:39:49 crc kubenswrapper[4933]: I1202 17:39:49.168334 4933 scope.go:117] "RemoveContainer" containerID="ab86ce2068441135695c77541201eb3c535b9355f93dd544561915e3e75a3495" Dec 02 17:39:49 crc kubenswrapper[4933]: I1202 17:39:49.333050 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dpxp8_must-gather-8jmnr_924c1228-98a1-4dea-b42f-d1f680719607/gather/0.log" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.282761 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dpxp8/must-gather-8jmnr"] Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.283406 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="copy" containerID="cri-o://ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66" gracePeriod=2 Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.294647 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dpxp8/must-gather-8jmnr"] Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.795673 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dpxp8_must-gather-8jmnr_924c1228-98a1-4dea-b42f-d1f680719607/copy/0.log" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.796418 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.857224 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dpxp8_must-gather-8jmnr_924c1228-98a1-4dea-b42f-d1f680719607/copy/0.log" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.857625 4933 generic.go:334] "Generic (PLEG): container finished" podID="924c1228-98a1-4dea-b42f-d1f680719607" containerID="ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66" exitCode=143 Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.857675 4933 scope.go:117] "RemoveContainer" containerID="ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.857706 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dpxp8/must-gather-8jmnr" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.880159 4933 scope.go:117] "RemoveContainer" containerID="6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.915360 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/924c1228-98a1-4dea-b42f-d1f680719607-must-gather-output\") pod \"924c1228-98a1-4dea-b42f-d1f680719607\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.915403 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94ppb\" (UniqueName: \"kubernetes.io/projected/924c1228-98a1-4dea-b42f-d1f680719607-kube-api-access-94ppb\") pod \"924c1228-98a1-4dea-b42f-d1f680719607\" (UID: \"924c1228-98a1-4dea-b42f-d1f680719607\") " Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.922043 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924c1228-98a1-4dea-b42f-d1f680719607-kube-api-access-94ppb" (OuterVolumeSpecName: "kube-api-access-94ppb") pod "924c1228-98a1-4dea-b42f-d1f680719607" (UID: "924c1228-98a1-4dea-b42f-d1f680719607"). InnerVolumeSpecName "kube-api-access-94ppb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.957716 4933 scope.go:117] "RemoveContainer" containerID="ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66" Dec 02 17:39:57 crc kubenswrapper[4933]: E1202 17:39:57.958965 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66\": container with ID starting with ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66 not found: ID does not exist" containerID="ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.959313 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66"} err="failed to get container status \"ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66\": rpc error: code = NotFound desc = could not find container \"ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66\": container with ID starting with ac3a47b1a2f3023a8f94376ef0d432d7e6f330334ae8476d05e15a28654ecd66 not found: ID does not exist" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.959462 4933 scope.go:117] "RemoveContainer" containerID="6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d" Dec 02 17:39:57 crc kubenswrapper[4933]: E1202 17:39:57.960481 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d\": container with ID starting with 6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d not found: ID does not exist" containerID="6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d" Dec 02 17:39:57 crc kubenswrapper[4933]: I1202 17:39:57.960546 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d"} err="failed to get container status \"6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d\": rpc error: code = NotFound desc = could not find container \"6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d\": container with ID starting with 6564edcec74ba05e16ad9eb2bf12211027116093d18a0a5c5d6c87dd9a55510d not found: ID does not exist" Dec 02 17:39:58 crc kubenswrapper[4933]: I1202 17:39:58.018241 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94ppb\" (UniqueName: \"kubernetes.io/projected/924c1228-98a1-4dea-b42f-d1f680719607-kube-api-access-94ppb\") on node \"crc\" DevicePath \"\"" Dec 02 17:39:58 crc kubenswrapper[4933]: I1202 17:39:58.110045 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924c1228-98a1-4dea-b42f-d1f680719607-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "924c1228-98a1-4dea-b42f-d1f680719607" (UID: "924c1228-98a1-4dea-b42f-d1f680719607"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:39:58 crc kubenswrapper[4933]: I1202 17:39:58.120383 4933 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/924c1228-98a1-4dea-b42f-d1f680719607-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 17:39:59 crc kubenswrapper[4933]: I1202 17:39:59.070351 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924c1228-98a1-4dea-b42f-d1f680719607" path="/var/lib/kubelet/pods/924c1228-98a1-4dea-b42f-d1f680719607/volumes" Dec 02 17:40:17 crc kubenswrapper[4933]: I1202 17:40:17.169769 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:40:17 crc kubenswrapper[4933]: I1202 17:40:17.170473 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:40:17 crc kubenswrapper[4933]: I1202 17:40:17.170551 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:40:17 crc kubenswrapper[4933]: I1202 17:40:17.171498 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 17:40:17 crc kubenswrapper[4933]: I1202 17:40:17.171540 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" gracePeriod=600 Dec 02 17:40:17 crc kubenswrapper[4933]: E1202 17:40:17.294221 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:40:18 crc kubenswrapper[4933]: I1202 17:40:18.173806 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" exitCode=0 Dec 02 17:40:18 crc kubenswrapper[4933]: I1202 17:40:18.173854 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8"} Dec 02 17:40:18 crc kubenswrapper[4933]: I1202 17:40:18.173945 4933 scope.go:117] "RemoveContainer" 
containerID="a83509fdc2a402118ff91daac7799c2a25cc85aee38ea32fab3a07cc2be654e1" Dec 02 17:40:18 crc kubenswrapper[4933]: I1202 17:40:18.174751 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:40:18 crc kubenswrapper[4933]: E1202 17:40:18.175353 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:40:29 crc kubenswrapper[4933]: I1202 17:40:29.054634 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:40:29 crc kubenswrapper[4933]: E1202 17:40:29.055899 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:40:44 crc kubenswrapper[4933]: I1202 17:40:44.053793 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:40:44 crc kubenswrapper[4933]: E1202 17:40:44.054623 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:40:49 crc kubenswrapper[4933]: I1202 17:40:49.316247 4933 scope.go:117] "RemoveContainer" containerID="8fb666ac0cecec44fb0c4e0625db4677c20281790414e6fc8fb98260085a273e" Dec 02 17:40:56 crc kubenswrapper[4933]: I1202 17:40:56.053670 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:40:56 crc kubenswrapper[4933]: E1202 17:40:56.055077 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:41:11 crc kubenswrapper[4933]: I1202 17:41:11.053938 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:41:11 crc kubenswrapper[4933]: E1202 17:41:11.054940 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:41:23 crc kubenswrapper[4933]: I1202 17:41:23.053936 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:41:23 crc kubenswrapper[4933]: E1202 17:41:23.055328 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:41:37 crc kubenswrapper[4933]: I1202 17:41:37.053800 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:41:37 crc kubenswrapper[4933]: E1202 17:41:37.054598 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:41:48 crc kubenswrapper[4933]: I1202 17:41:48.055159 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:41:48 crc kubenswrapper[4933]: E1202 17:41:48.056893 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:42:03 crc kubenswrapper[4933]: I1202 17:42:03.053919 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:42:03 crc kubenswrapper[4933]: E1202 17:42:03.055186 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:42:17 crc kubenswrapper[4933]: I1202 17:42:17.067876 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:42:17 crc kubenswrapper[4933]: E1202 17:42:17.068759 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:42:31 crc kubenswrapper[4933]: I1202 17:42:31.055714 4933 
scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:42:31 crc kubenswrapper[4933]: E1202 17:42:31.058699 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:42:44 crc kubenswrapper[4933]: I1202 17:42:44.054994 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:42:44 crc kubenswrapper[4933]: E1202 17:42:44.056009 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.888268 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m26tx"] Dec 02 17:42:54 crc kubenswrapper[4933]: E1202 17:42:54.889341 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="gather" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889359 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="gather" Dec 02 17:42:54 crc kubenswrapper[4933]: E1202 17:42:54.889392 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="copy" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889400 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="copy" Dec 02 17:42:54 crc kubenswrapper[4933]: E1202 17:42:54.889416 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="extract-utilities" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889425 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="extract-utilities" Dec 02 17:42:54 crc kubenswrapper[4933]: E1202 17:42:54.889471 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="extract-content" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889479 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="extract-content" Dec 02 17:42:54 crc kubenswrapper[4933]: E1202 17:42:54.889499 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="registry-server" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889507 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="registry-server" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889847 4933 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2eeaf410-601f-4e03-960e-1694782292e1" containerName="registry-server" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889880 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="gather" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.889896 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="924c1228-98a1-4dea-b42f-d1f680719607" containerName="copy" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.893715 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.915874 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m26tx"] Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.997090 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-825qs\" (UniqueName: \"kubernetes.io/projected/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-kube-api-access-825qs\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.997190 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-utilities\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:54 crc kubenswrapper[4933]: I1202 17:42:54.997677 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-catalog-content\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.102158 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-825qs\" (UniqueName: \"kubernetes.io/projected/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-kube-api-access-825qs\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.102241 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-utilities\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.102372 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-catalog-content\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.103119 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-utilities\") pod \"redhat-marketplace-m26tx\" (UID: 
\"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.103173 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-catalog-content\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.136365 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-825qs\" (UniqueName: \"kubernetes.io/projected/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-kube-api-access-825qs\") pod \"redhat-marketplace-m26tx\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.225439 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:42:55 crc kubenswrapper[4933]: I1202 17:42:55.789137 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m26tx"] Dec 02 17:42:56 crc kubenswrapper[4933]: I1202 17:42:56.778880 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerID="617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18" exitCode=0 Dec 02 17:42:56 crc kubenswrapper[4933]: I1202 17:42:56.779358 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerDied","Data":"617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18"} Dec 02 17:42:56 crc kubenswrapper[4933]: I1202 17:42:56.779486 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerStarted","Data":"13b8425796b9e7356d896c77735e88585effe027494e7b957ba39d8b9dc5b1d5"} Dec 02 17:42:56 crc kubenswrapper[4933]: I1202 17:42:56.792368 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 17:42:57 crc kubenswrapper[4933]: I1202 17:42:57.060589 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:42:57 crc kubenswrapper[4933]: E1202 17:42:57.061227 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:42:57 crc kubenswrapper[4933]: I1202 17:42:57.796535 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerStarted","Data":"a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2"} Dec 02 17:42:58 crc kubenswrapper[4933]: I1202 17:42:58.815642 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerID="a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2" 
exitCode=0 Dec 02 17:42:58 crc kubenswrapper[4933]: I1202 17:42:58.815927 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerDied","Data":"a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2"} Dec 02 17:42:59 crc kubenswrapper[4933]: I1202 17:42:59.830997 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerStarted","Data":"bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61"} Dec 02 17:42:59 crc kubenswrapper[4933]: I1202 17:42:59.861046 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m26tx" podStartSLOduration=3.380434486 podStartE2EDuration="5.86101862s" podCreationTimestamp="2025-12-02 17:42:54 +0000 UTC" firstStartedPulling="2025-12-02 17:42:56.791792096 +0000 UTC m=+6640.043018799" lastFinishedPulling="2025-12-02 17:42:59.27237623 +0000 UTC m=+6642.523602933" observedRunningTime="2025-12-02 17:42:59.850301716 +0000 UTC m=+6643.101528429" watchObservedRunningTime="2025-12-02 17:42:59.86101862 +0000 UTC m=+6643.112245363" Dec 02 17:43:05 crc kubenswrapper[4933]: I1202 17:43:05.225653 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:43:05 crc kubenswrapper[4933]: I1202 17:43:05.226339 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:43:05 crc kubenswrapper[4933]: I1202 17:43:05.301871 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:43:06 crc kubenswrapper[4933]: I1202 17:43:06.012476 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:43:06 crc kubenswrapper[4933]: I1202 17:43:06.090367 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m26tx"] Dec 02 17:43:07 crc kubenswrapper[4933]: I1202 17:43:07.952113 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m26tx" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="registry-server" containerID="cri-o://bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61" gracePeriod=2 Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.532284 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.704069 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-catalog-content\") pod \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.704170 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-825qs\" (UniqueName: \"kubernetes.io/projected/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-kube-api-access-825qs\") pod \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.704235 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-utilities\") pod \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\" (UID: \"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0\") " Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.705534 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-utilities" (OuterVolumeSpecName: "utilities") pod "6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" (UID: "6d6ca96f-9078-4a0b-8277-2a1f1e16cec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.712987 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-kube-api-access-825qs" (OuterVolumeSpecName: "kube-api-access-825qs") pod "6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" (UID: "6d6ca96f-9078-4a0b-8277-2a1f1e16cec0"). InnerVolumeSpecName "kube-api-access-825qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.728998 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" (UID: "6d6ca96f-9078-4a0b-8277-2a1f1e16cec0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.807400 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.807442 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-825qs\" (UniqueName: \"kubernetes.io/projected/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-kube-api-access-825qs\") on node \"crc\" DevicePath \"\"" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.807453 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.981584 4933 generic.go:334] "Generic (PLEG): container finished" podID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerID="bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61" exitCode=0 Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.981642 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerDied","Data":"bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61"} Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.981681 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m26tx" event={"ID":"6d6ca96f-9078-4a0b-8277-2a1f1e16cec0","Type":"ContainerDied","Data":"13b8425796b9e7356d896c77735e88585effe027494e7b957ba39d8b9dc5b1d5"} Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.981704 4933 scope.go:117] "RemoveContainer" containerID="bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61" Dec 02 17:43:08 crc kubenswrapper[4933]: I1202 17:43:08.981730 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m26tx" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.031046 4933 scope.go:117] "RemoveContainer" containerID="a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.050080 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m26tx"] Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.057657 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:43:09 crc kubenswrapper[4933]: E1202 17:43:09.058377 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.068265 4933 scope.go:117] "RemoveContainer" containerID="617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.073891 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m26tx"] Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.139800 4933 scope.go:117] "RemoveContainer" containerID="bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61" Dec 02 17:43:09 crc kubenswrapper[4933]: E1202 17:43:09.140302 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61\": container with ID starting with bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61 not found: ID does not exist" containerID="bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.140342 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61"} err="failed to get container status \"bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61\": rpc error: code = NotFound desc = could not find container \"bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61\": container with ID starting with bdb553285d7753f2c1e490fc98b9f12e824408d30c036f5201258b7dba664c61 not found: ID does not exist" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.140368 4933 scope.go:117] "RemoveContainer" containerID="a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2" Dec 02 17:43:09 crc kubenswrapper[4933]: E1202 17:43:09.141689 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2\": container with ID starting with a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2 not found: ID does not exist" containerID="a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.141718 4933 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2"} err="failed to get container status \"a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2\": rpc error: code = NotFound desc = could not find container \"a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2\": container with ID starting with a45dc4d0b7403f1ef46bf76a6e0d2d9b02203643a7573a7043eb4ccb02cdc1d2 not found: ID does not exist" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.141736 4933 scope.go:117] "RemoveContainer" containerID="617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18" Dec 02 17:43:09 crc kubenswrapper[4933]: E1202 17:43:09.142158 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18\": container with ID starting with 617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18 not found: ID does not exist" containerID="617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18" Dec 02 17:43:09 crc kubenswrapper[4933]: I1202 17:43:09.142693 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18"} err="failed to get container status \"617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18\": rpc error: code = NotFound desc = could not find container \"617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18\": container with ID starting with 617ece9037f28f68d5c6b4b536324343d8c3e53d2cd1b54c7a4389c3e3ea7f18 not found: ID does not exist" Dec 02 17:43:11 crc kubenswrapper[4933]: I1202 17:43:11.080197 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" path="/var/lib/kubelet/pods/6d6ca96f-9078-4a0b-8277-2a1f1e16cec0/volumes" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.053765 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:43:22 crc kubenswrapper[4933]: E1202 17:43:22.055151 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.409518 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pg8p/must-gather-s92q2"] Dec 02 17:43:22 crc kubenswrapper[4933]: E1202 17:43:22.410283 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="extract-content" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.410311 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="extract-content" Dec 02 17:43:22 crc kubenswrapper[4933]: E1202 17:43:22.410365 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="registry-server" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.410376 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" 
containerName="registry-server" Dec 02 17:43:22 crc kubenswrapper[4933]: E1202 17:43:22.410394 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="extract-utilities" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.410405 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="extract-utilities" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.410765 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6ca96f-9078-4a0b-8277-2a1f1e16cec0" containerName="registry-server" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.412907 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.419047 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7pg8p"/"kube-root-ca.crt" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.420915 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7pg8p"/"openshift-service-ca.crt" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.429597 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pg8p/must-gather-s92q2"] Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.480535 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2828\" (UniqueName: \"kubernetes.io/projected/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-kube-api-access-x2828\") pod \"must-gather-s92q2\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.480715 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-must-gather-output\") pod \"must-gather-s92q2\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.582979 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2828\" (UniqueName: \"kubernetes.io/projected/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-kube-api-access-x2828\") pod \"must-gather-s92q2\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.583085 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-must-gather-output\") pod \"must-gather-s92q2\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.583495 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-must-gather-output\") pod \"must-gather-s92q2\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.610378 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2828\" (UniqueName: 
\"kubernetes.io/projected/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-kube-api-access-x2828\") pod \"must-gather-s92q2\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:22 crc kubenswrapper[4933]: I1202 17:43:22.734941 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:43:23 crc kubenswrapper[4933]: I1202 17:43:23.076282 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pg8p/must-gather-s92q2"] Dec 02 17:43:23 crc kubenswrapper[4933]: I1202 17:43:23.270446 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/must-gather-s92q2" event={"ID":"3e542deb-5ddf-4bfc-9eea-73e4d47aa429","Type":"ContainerStarted","Data":"6012a0dcc2336d577be1a40bd76ac54f9eefda65be2dcddc7aeafb5cefd30d86"} Dec 02 17:43:24 crc kubenswrapper[4933]: I1202 17:43:24.284297 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/must-gather-s92q2" event={"ID":"3e542deb-5ddf-4bfc-9eea-73e4d47aa429","Type":"ContainerStarted","Data":"2b76cd34a84fae6817ecf59866964e90adff390d922825b0fc42e143f2d9c160"} Dec 02 17:43:24 crc kubenswrapper[4933]: I1202 17:43:24.284665 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/must-gather-s92q2" event={"ID":"3e542deb-5ddf-4bfc-9eea-73e4d47aa429","Type":"ContainerStarted","Data":"3922682c0c171c839f0090e22a4d32e9b51e2254edbfb5872d993628fed90aea"} Dec 02 17:43:25 crc kubenswrapper[4933]: I1202 17:43:25.333999 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pg8p/must-gather-s92q2" podStartSLOduration=3.333970776 podStartE2EDuration="3.333970776s" podCreationTimestamp="2025-12-02 17:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 17:43:25.326173889 +0000 UTC m=+6668.577400602" watchObservedRunningTime="2025-12-02 17:43:25.333970776 +0000 UTC m=+6668.585197499" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.455096 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-2wlqx"] Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.457445 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.460574 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7pg8p"/"default-dockercfg-gfbxz" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.494191 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-host\") pod \"crc-debug-2wlqx\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.494539 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r4j5\" (UniqueName: \"kubernetes.io/projected/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-kube-api-access-2r4j5\") pod \"crc-debug-2wlqx\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.597323 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-host\") pod \"crc-debug-2wlqx\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.597378 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r4j5\" (UniqueName: \"kubernetes.io/projected/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-kube-api-access-2r4j5\") pod \"crc-debug-2wlqx\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.597817 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-host\") pod \"crc-debug-2wlqx\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.618536 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r4j5\" (UniqueName: \"kubernetes.io/projected/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-kube-api-access-2r4j5\") pod \"crc-debug-2wlqx\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: I1202 17:43:27.776303 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:43:27 crc kubenswrapper[4933]: W1202 17:43:27.812200 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3929b2c0_2ddd_4c6f_b72a_4f88bb6318dd.slice/crio-a9f4260b7a859723f71f7265320b7239fd6e3eb85f833cd81cf0219ca21dc2e3 WatchSource:0}: Error finding container a9f4260b7a859723f71f7265320b7239fd6e3eb85f833cd81cf0219ca21dc2e3: Status 404 returned error can't find the container with id a9f4260b7a859723f71f7265320b7239fd6e3eb85f833cd81cf0219ca21dc2e3 Dec 02 17:43:28 crc kubenswrapper[4933]: I1202 17:43:28.355751 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" event={"ID":"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd","Type":"ContainerStarted","Data":"a9f4260b7a859723f71f7265320b7239fd6e3eb85f833cd81cf0219ca21dc2e3"} Dec 02 17:43:29 crc kubenswrapper[4933]: I1202 17:43:29.366323 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" event={"ID":"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd","Type":"ContainerStarted","Data":"650006d6bddf0e0a96ced04438d1d16dffb5be05d1c2927c66f3b84617e90dde"} Dec 02 17:43:29 crc kubenswrapper[4933]: I1202 17:43:29.406727 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" podStartSLOduration=2.406705157 podStartE2EDuration="2.406705157s" podCreationTimestamp="2025-12-02 17:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 17:43:29.379622119 +0000 UTC m=+6672.630848822" watchObservedRunningTime="2025-12-02 17:43:29.406705157 +0000 UTC m=+6672.657931860" Dec 02 17:43:35 crc kubenswrapper[4933]: I1202 17:43:35.053858 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:43:35 crc kubenswrapper[4933]: E1202 17:43:35.054806 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:43:48 crc kubenswrapper[4933]: I1202 17:43:48.054209 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:43:48 crc kubenswrapper[4933]: E1202 17:43:48.055145 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:44:03 crc kubenswrapper[4933]: I1202 17:44:03.054396 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:44:03 crc kubenswrapper[4933]: E1202 17:44:03.055581 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:44:09 crc kubenswrapper[4933]: I1202 17:44:09.794347 4933 generic.go:334] "Generic (PLEG): container finished" podID="3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" containerID="650006d6bddf0e0a96ced04438d1d16dffb5be05d1c2927c66f3b84617e90dde" exitCode=0 Dec 02 17:44:09 crc kubenswrapper[4933]: I1202 17:44:09.794419 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" event={"ID":"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd","Type":"ContainerDied","Data":"650006d6bddf0e0a96ced04438d1d16dffb5be05d1c2927c66f3b84617e90dde"} Dec 02 17:44:10 crc kubenswrapper[4933]: I1202 17:44:10.917888 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:44:10 crc kubenswrapper[4933]: I1202 17:44:10.962486 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-2wlqx"] Dec 02 17:44:10 crc kubenswrapper[4933]: I1202 17:44:10.971703 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-2wlqx"] Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.037918 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r4j5\" (UniqueName: \"kubernetes.io/projected/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-kube-api-access-2r4j5\") pod \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.038010 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-host\") pod \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\" (UID: \"3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd\") " Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.038145 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-host" (OuterVolumeSpecName: "host") pod "3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" (UID: "3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.038941 4933 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-host\") on node \"crc\" DevicePath \"\"" Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.044594 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-kube-api-access-2r4j5" (OuterVolumeSpecName: "kube-api-access-2r4j5") pod "3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" (UID: "3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd"). InnerVolumeSpecName "kube-api-access-2r4j5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.077106 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" path="/var/lib/kubelet/pods/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd/volumes" Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.141490 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r4j5\" (UniqueName: \"kubernetes.io/projected/3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd-kube-api-access-2r4j5\") on node \"crc\" DevicePath \"\"" Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.827941 4933 scope.go:117] "RemoveContainer" containerID="650006d6bddf0e0a96ced04438d1d16dffb5be05d1c2927c66f3b84617e90dde" Dec 02 17:44:11 crc kubenswrapper[4933]: I1202 17:44:11.828425 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-2wlqx" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.152884 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-rgjb4"] Dec 02 17:44:12 crc kubenswrapper[4933]: E1202 17:44:12.154607 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" containerName="container-00" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.154694 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" containerName="container-00" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.155090 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3929b2c0-2ddd-4c6f-b72a-4f88bb6318dd" containerName="container-00" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.155948 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.158372 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7pg8p"/"default-dockercfg-gfbxz" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.270623 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-host\") pod \"crc-debug-rgjb4\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.270800 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cl5\" (UniqueName: \"kubernetes.io/projected/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-kube-api-access-z7cl5\") pod \"crc-debug-rgjb4\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.373180 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cl5\" (UniqueName: \"kubernetes.io/projected/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-kube-api-access-z7cl5\") pod \"crc-debug-rgjb4\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.373849 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-host\") pod \"crc-debug-rgjb4\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.373973 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-host\") pod \"crc-debug-rgjb4\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.395611 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cl5\" (UniqueName: \"kubernetes.io/projected/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-kube-api-access-z7cl5\") pod \"crc-debug-rgjb4\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.476859 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:12 crc kubenswrapper[4933]: W1202 17:44:12.512627 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda34acbe9_8f14_4554_90a1_44b3d95cdfd0.slice/crio-af00bf9d2503df1c12c2fec674f0d0efeb47ee89b0a20a816f79b148ed049a71 WatchSource:0}: Error finding container af00bf9d2503df1c12c2fec674f0d0efeb47ee89b0a20a816f79b148ed049a71: Status 404 returned error can't find the container with id af00bf9d2503df1c12c2fec674f0d0efeb47ee89b0a20a816f79b148ed049a71 Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.844547 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" event={"ID":"a34acbe9-8f14-4554-90a1-44b3d95cdfd0","Type":"ContainerStarted","Data":"16e9993ea12cd25d9b56386486263292fb512ae34f6764d23fff4aa65f52499a"} Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.844873 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" event={"ID":"a34acbe9-8f14-4554-90a1-44b3d95cdfd0","Type":"ContainerStarted","Data":"af00bf9d2503df1c12c2fec674f0d0efeb47ee89b0a20a816f79b148ed049a71"} Dec 02 17:44:12 crc kubenswrapper[4933]: I1202 17:44:12.865547 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" podStartSLOduration=0.865529704 podStartE2EDuration="865.529704ms" podCreationTimestamp="2025-12-02 17:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 17:44:12.86049551 +0000 UTC m=+6716.111722243" watchObservedRunningTime="2025-12-02 17:44:12.865529704 +0000 UTC m=+6716.116756407" Dec 02 17:44:13 crc kubenswrapper[4933]: I1202 17:44:13.856540 4933 generic.go:334] "Generic (PLEG): container finished" podID="a34acbe9-8f14-4554-90a1-44b3d95cdfd0" containerID="16e9993ea12cd25d9b56386486263292fb512ae34f6764d23fff4aa65f52499a" exitCode=0 Dec 02 17:44:13 crc kubenswrapper[4933]: I1202 17:44:13.856597 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" event={"ID":"a34acbe9-8f14-4554-90a1-44b3d95cdfd0","Type":"ContainerDied","Data":"16e9993ea12cd25d9b56386486263292fb512ae34f6764d23fff4aa65f52499a"} Dec 02 17:44:14 crc kubenswrapper[4933]: I1202 17:44:14.981125 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.136581 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cl5\" (UniqueName: \"kubernetes.io/projected/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-kube-api-access-z7cl5\") pod \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.136738 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-host\") pod \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\" (UID: \"a34acbe9-8f14-4554-90a1-44b3d95cdfd0\") " Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.137069 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-host" (OuterVolumeSpecName: "host") pod "a34acbe9-8f14-4554-90a1-44b3d95cdfd0" (UID: "a34acbe9-8f14-4554-90a1-44b3d95cdfd0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.137509 4933 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-host\") on node \"crc\" DevicePath \"\"" Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.143554 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-kube-api-access-z7cl5" (OuterVolumeSpecName: "kube-api-access-z7cl5") pod "a34acbe9-8f14-4554-90a1-44b3d95cdfd0" (UID: "a34acbe9-8f14-4554-90a1-44b3d95cdfd0"). InnerVolumeSpecName "kube-api-access-z7cl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.240237 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7cl5\" (UniqueName: \"kubernetes.io/projected/a34acbe9-8f14-4554-90a1-44b3d95cdfd0-kube-api-access-z7cl5\") on node \"crc\" DevicePath \"\"" Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.290178 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-rgjb4"] Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.304902 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-rgjb4"] Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.895324 4933 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af00bf9d2503df1c12c2fec674f0d0efeb47ee89b0a20a816f79b148ed049a71" Dec 02 17:44:15 crc kubenswrapper[4933]: I1202 17:44:15.895387 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-rgjb4" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.487440 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-xnzx7"] Dec 02 17:44:16 crc kubenswrapper[4933]: E1202 17:44:16.488686 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34acbe9-8f14-4554-90a1-44b3d95cdfd0" containerName="container-00" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.488710 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34acbe9-8f14-4554-90a1-44b3d95cdfd0" containerName="container-00" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.491387 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34acbe9-8f14-4554-90a1-44b3d95cdfd0" containerName="container-00" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.495287 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.499378 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7pg8p"/"default-dockercfg-gfbxz" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.604598 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ls9l\" (UniqueName: \"kubernetes.io/projected/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-kube-api-access-6ls9l\") pod \"crc-debug-xnzx7\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.604900 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-host\") pod \"crc-debug-xnzx7\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.707368 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-host\") pod \"crc-debug-xnzx7\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.707491 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-host\") pod \"crc-debug-xnzx7\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.707595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ls9l\" (UniqueName: \"kubernetes.io/projected/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-kube-api-access-6ls9l\") pod \"crc-debug-xnzx7\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.730849 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ls9l\" (UniqueName: \"kubernetes.io/projected/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-kube-api-access-6ls9l\") pod \"crc-debug-xnzx7\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 
17:44:16.821356 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:16 crc kubenswrapper[4933]: W1202 17:44:16.887593 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c5ebad_8c85_4d83_bf86_8db3d11cf9f6.slice/crio-f9883d10ea76dc8c6a62d01c30488d3d873c144b682d3b574e405dd87a969463 WatchSource:0}: Error finding container f9883d10ea76dc8c6a62d01c30488d3d873c144b682d3b574e405dd87a969463: Status 404 returned error can't find the container with id f9883d10ea76dc8c6a62d01c30488d3d873c144b682d3b574e405dd87a969463 Dec 02 17:44:16 crc kubenswrapper[4933]: I1202 17:44:16.913740 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" event={"ID":"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6","Type":"ContainerStarted","Data":"f9883d10ea76dc8c6a62d01c30488d3d873c144b682d3b574e405dd87a969463"} Dec 02 17:44:17 crc kubenswrapper[4933]: I1202 17:44:17.080470 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:44:17 crc kubenswrapper[4933]: I1202 17:44:17.082075 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34acbe9-8f14-4554-90a1-44b3d95cdfd0" path="/var/lib/kubelet/pods/a34acbe9-8f14-4554-90a1-44b3d95cdfd0/volumes" Dec 02 17:44:17 crc kubenswrapper[4933]: E1202 17:44:17.082109 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:44:17 crc kubenswrapper[4933]: I1202 17:44:17.932503 4933 generic.go:334] "Generic (PLEG): container finished" podID="d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" containerID="e78eda7279c81482c1b77f555a818b82fe061b36fbae769b25f16d41287befad" exitCode=0 Dec 02 17:44:17 crc kubenswrapper[4933]: I1202 17:44:17.932704 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" event={"ID":"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6","Type":"ContainerDied","Data":"e78eda7279c81482c1b77f555a818b82fe061b36fbae769b25f16d41287befad"} Dec 02 17:44:17 crc kubenswrapper[4933]: I1202 17:44:17.998080 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-xnzx7"] Dec 02 17:44:18 crc kubenswrapper[4933]: I1202 17:44:18.017700 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pg8p/crc-debug-xnzx7"] Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.100753 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.279137 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-host\") pod \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.279289 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-host" (OuterVolumeSpecName: "host") pod "d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" (UID: "d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.279943 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ls9l\" (UniqueName: \"kubernetes.io/projected/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-kube-api-access-6ls9l\") pod \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\" (UID: \"d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6\") " Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.280939 4933 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-host\") on node \"crc\" DevicePath \"\"" Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.294054 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-kube-api-access-6ls9l" (OuterVolumeSpecName: "kube-api-access-6ls9l") pod "d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" (UID: "d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6"). InnerVolumeSpecName "kube-api-access-6ls9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.383299 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ls9l\" (UniqueName: \"kubernetes.io/projected/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6-kube-api-access-6ls9l\") on node \"crc\" DevicePath \"\"" Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.959205 4933 scope.go:117] "RemoveContainer" containerID="e78eda7279c81482c1b77f555a818b82fe061b36fbae769b25f16d41287befad" Dec 02 17:44:19 crc kubenswrapper[4933]: I1202 17:44:19.959234 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/crc-debug-xnzx7" Dec 02 17:44:21 crc kubenswrapper[4933]: I1202 17:44:21.064986 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" path="/var/lib/kubelet/pods/d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6/volumes" Dec 02 17:44:28 crc kubenswrapper[4933]: I1202 17:44:28.054352 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:44:28 crc kubenswrapper[4933]: E1202 17:44:28.056165 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:44:41 crc kubenswrapper[4933]: I1202 17:44:41.054360 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:44:41 crc kubenswrapper[4933]: E1202 17:44:41.057007 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:44:53 crc kubenswrapper[4933]: I1202 17:44:53.920399 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-api/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.126588 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-listener/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.135680 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-evaluator/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.153508 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_9e5f2071-d2a6-49cb-9c10-34a6f81bc75b/aodh-notifier/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.320593 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56654f9db6-xk8pt_cec17902-d3a5-4961-88eb-65c3773747fa/barbican-api/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.370558 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56654f9db6-xk8pt_cec17902-d3a5-4961-88eb-65c3773747fa/barbican-api-log/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.450315 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64dd586798-jl4hw_78795142-23f4-4bfd-ba25-479d6cc3c19f/barbican-keystone-listener/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.654132 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64dd586798-jl4hw_78795142-23f4-4bfd-ba25-479d6cc3c19f/barbican-keystone-listener-log/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.673503 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-56bd7dd77f-sxk4g_04729c7a-7c1b-4138-832a-f6d0bf327720/barbican-worker/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.717311 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56bd7dd77f-sxk4g_04729c7a-7c1b-4138-832a-f6d0bf327720/barbican-worker-log/0.log" Dec 02 17:44:54 crc kubenswrapper[4933]: I1202 17:44:54.935073 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-db4xp_2060aa16-0f55-457f-98c1-058372e78f0f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.045419 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/ceilometer-central-agent/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.054744 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:44:55 crc kubenswrapper[4933]: E1202 17:44:55.055092 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.181910 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/proxy-httpd/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.185841 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/ceilometer-notification-agent/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.219707 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d788537f-6bd8-4db8-a47d-2053d24dca64/sg-core/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.490780 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02/cinder-api-log/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.726789 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9e9fc2a4-bdc8-4f3f-9ef1-b9e9a141cd02/cinder-api/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.813153 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60f07df2-68b3-4c78-9279-5dd6d9c71397/probe/0.log" Dec 02 17:44:55 crc kubenswrapper[4933]: I1202 17:44:55.947471 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_60f07df2-68b3-4c78-9279-5dd6d9c71397/cinder-scheduler/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.061780 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-87nbr_0a4e084f-f2c5-418d-8990-6074168317ab/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.168471 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b4tdz_9e5c1276-65ce-4553-9d05-e8e27aaef6b3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.268922 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-lb9d4_d8a58bed-855e-481a-a1c6-a2fa6851e55c/init/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.487149 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-lb9d4_d8a58bed-855e-481a-a1c6-a2fa6851e55c/init/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.598855 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-lb9d4_d8a58bed-855e-481a-a1c6-a2fa6851e55c/dnsmasq-dns/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.609956 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-dtvjg_2baaff97-25e3-44ca-818e-0c9d121abe01/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.806213 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_848e9b92-1e28-4d82-b057-0335915a6155/glance-log/0.log" Dec 02 17:44:56 crc kubenswrapper[4933]: I1202 17:44:56.831090 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_848e9b92-1e28-4d82-b057-0335915a6155/glance-httpd/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.004376 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_45180c36-0fa3-4abc-a647-9b4beb0ed87d/glance-httpd/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.033563 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_45180c36-0fa3-4abc-a647-9b4beb0ed87d/glance-log/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.553682 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-c8865d5c5-ktz2c_1aa97de0-3a2a-474c-904e-5fa59773c33c/heat-engine/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.764173 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-87krj_6efaf1ff-2f24-4e58-8899-a5ca660bd6cc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.892301 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7cfb68f886-5mpb2_0f856765-5f4c-445f-bcd1-736db6fb2c56/heat-api/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.982499 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mqv4w_213c04c8-e37f-4898-8141-cd8a5a5e6626/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:57 crc kubenswrapper[4933]: I1202 17:44:57.994146 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7d8849d76d-cc29j_6ce3e9ef-c14e-49c8-bd2b-8b268f028516/heat-cfnapi/0.log" Dec 02 17:44:58 crc kubenswrapper[4933]: I1202 17:44:58.287489 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411581-d66dr_86c26f4d-35bf-4616-8935-e00dad2fb46a/keystone-cron/0.log" Dec 02 17:44:58 crc kubenswrapper[4933]: I1202 17:44:58.299006 4933 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_06766e68-6811-4cd2-bb90-67cc353669e6/kube-state-metrics/0.log" Dec 02 17:44:58 crc kubenswrapper[4933]: I1202 17:44:58.508952 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6bm6r_7b465ce0-4efc-4624-b140-f3bbb0e0b420/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:58 crc kubenswrapper[4933]: I1202 17:44:58.593778 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-995pm_e28d7f01-efc3-4011-baea-61cc2a6f0cd9/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:58 crc kubenswrapper[4933]: I1202 17:44:58.613747 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5445f4c57f-62kcb_e87150f8-ace7-485f-bbfe-8818205e400b/keystone-api/0.log" Dec 02 17:44:58 crc kubenswrapper[4933]: I1202 17:44:58.827198 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_43672f01-bd87-4595-8ba0-d76811762bc2/mysqld-exporter/0.log" Dec 02 17:44:59 crc kubenswrapper[4933]: I1202 17:44:59.219704 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-snnm2_438c203a-5750-4377-95f9-3aa2353f6f53/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:44:59 crc kubenswrapper[4933]: I1202 17:44:59.241243 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d5bbf46cc-92zkh_03882bf4-d9e9-460d-a94e-fb17189d0278/neutron-httpd/0.log" Dec 02 17:44:59 crc kubenswrapper[4933]: I1202 17:44:59.286808 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d5bbf46cc-92zkh_03882bf4-d9e9-460d-a94e-fb17189d0278/neutron-api/0.log" Dec 02 17:44:59 crc kubenswrapper[4933]: I1202 17:44:59.999651 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f8103500-191e-4988-b97f-532c1f1c1b20/nova-cell0-conductor-conductor/0.log" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.240172 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57"] Dec 02 17:45:00 crc kubenswrapper[4933]: E1202 17:45:00.255985 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" containerName="container-00" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.256034 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" containerName="container-00" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.256325 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c5ebad-8c85-4d83-bf86-8db3d11cf9f6" containerName="container-00" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.262030 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57"] Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.262131 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.264161 4933 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.264792 4933 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.310000 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9aa955fa-0221-4590-bd34-0ee84ba06562/nova-api-log/0.log" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.358146 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f01830c-0127-4d73-9e55-05d973de817e-config-volume\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.358219 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f01830c-0127-4d73-9e55-05d973de817e-secret-volume\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.358268 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrs5\" (UniqueName: \"kubernetes.io/projected/3f01830c-0127-4d73-9e55-05d973de817e-kube-api-access-bgrs5\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.461854 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f01830c-0127-4d73-9e55-05d973de817e-config-volume\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.461933 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f01830c-0127-4d73-9e55-05d973de817e-secret-volume\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.462004 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrs5\" (UniqueName: \"kubernetes.io/projected/3f01830c-0127-4d73-9e55-05d973de817e-kube-api-access-bgrs5\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.463447 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3f01830c-0127-4d73-9e55-05d973de817e-config-volume\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.478483 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f01830c-0127-4d73-9e55-05d973de817e-secret-volume\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.480886 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrs5\" (UniqueName: \"kubernetes.io/projected/3f01830c-0127-4d73-9e55-05d973de817e-kube-api-access-bgrs5\") pod \"collect-profiles-29411625-xmk57\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.501509 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3186bcaa-4306-4842-885b-6afd00553e78/nova-cell1-conductor-conductor/0.log" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.586481 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.720879 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d7593905-f88c-466a-b547-9a3e59588987/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 17:45:00 crc kubenswrapper[4933]: I1202 17:45:00.842939 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cbhhv_5e83fdad-1cda-4282-b5e7-6911bdd8d9a0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.179492 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57"] Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.192726 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d/nova-metadata-log/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.212043 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9aa955fa-0221-4590-bd34-0ee84ba06562/nova-api-api/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.446586 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" event={"ID":"3f01830c-0127-4d73-9e55-05d973de817e","Type":"ContainerStarted","Data":"dae313ad90e1232146023b62115e3fba5f1601ab9a3e682a84b71e1835b1e5d5"} Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.446929 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" event={"ID":"3f01830c-0127-4d73-9e55-05d973de817e","Type":"ContainerStarted","Data":"fed79d2c68cf63eb91b0d32448d198866bd73c2a391b7c890792fe727ea20d74"} Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.478680 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" 
podStartSLOduration=1.478658935 podStartE2EDuration="1.478658935s" podCreationTimestamp="2025-12-02 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 17:45:01.462257111 +0000 UTC m=+6764.713483824" watchObservedRunningTime="2025-12-02 17:45:01.478658935 +0000 UTC m=+6764.729885648" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.492233 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_553a306b-ceeb-41b2-8b2c-c32dcd70639e/mysql-bootstrap/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.684391 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a40f5b56-cf52-45da-b3b8-70a271c07984/nova-scheduler-scheduler/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.723572 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_553a306b-ceeb-41b2-8b2c-c32dcd70639e/mysql-bootstrap/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.805234 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_553a306b-ceeb-41b2-8b2c-c32dcd70639e/galera/0.log" Dec 02 17:45:01 crc kubenswrapper[4933]: I1202 17:45:01.977800 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cd426cb0-522e-4c62-874e-a85119e82490/mysql-bootstrap/0.log" Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.206203 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cd426cb0-522e-4c62-874e-a85119e82490/mysql-bootstrap/0.log" Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.238944 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cd426cb0-522e-4c62-874e-a85119e82490/galera/0.log" Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.427985 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f673f811-c1d1-4e11-94d8-4932e9761bbf/openstackclient/0.log" Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.460175 4933 generic.go:334] "Generic (PLEG): container finished" podID="3f01830c-0127-4d73-9e55-05d973de817e" containerID="dae313ad90e1232146023b62115e3fba5f1601ab9a3e682a84b71e1835b1e5d5" exitCode=0 Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.460223 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" event={"ID":"3f01830c-0127-4d73-9e55-05d973de817e","Type":"ContainerDied","Data":"dae313ad90e1232146023b62115e3fba5f1601ab9a3e682a84b71e1835b1e5d5"} Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.552069 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-g8qb5_8a35aeec-161d-4ef9-a42b-7967c06c7249/ovn-controller/0.log" Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.732042 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cvdrx_b9b7f4a7-69fa-4d7a-bbbe-631089b57bf2/openstack-network-exporter/0.log" Dec 02 17:45:02 crc kubenswrapper[4933]: I1202 17:45:02.946489 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovsdb-server-init/0.log" Dec 02 17:45:03 crc kubenswrapper[4933]: I1202 17:45:03.110502 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovsdb-server-init/0.log" Dec 02 17:45:03 crc kubenswrapper[4933]: I1202 17:45:03.126002 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovs-vswitchd/0.log" Dec 02 17:45:03 crc kubenswrapper[4933]: I1202 17:45:03.239508 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2z7kb_34ddcddf-7552-4555-8013-3dc06fc2549a/ovsdb-server/0.log" Dec 02 17:45:03 crc kubenswrapper[4933]: I1202 17:45:03.571377 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-56tww_5bb48f13-b80b-462d-acf5-8751ec7aaa8e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:03 crc kubenswrapper[4933]: I1202 17:45:03.845490 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f365adeb-4cfd-409d-8bd8-29b4779e1e0f/openstack-network-exporter/0.log" Dec 02 17:45:03 crc kubenswrapper[4933]: I1202 17:45:03.948472 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f365adeb-4cfd-409d-8bd8-29b4779e1e0f/ovn-northd/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.015679 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.075901 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgrs5\" (UniqueName: \"kubernetes.io/projected/3f01830c-0127-4d73-9e55-05d973de817e-kube-api-access-bgrs5\") pod \"3f01830c-0127-4d73-9e55-05d973de817e\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.076219 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f01830c-0127-4d73-9e55-05d973de817e-config-volume\") pod \"3f01830c-0127-4d73-9e55-05d973de817e\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.076295 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f01830c-0127-4d73-9e55-05d973de817e-secret-volume\") pod \"3f01830c-0127-4d73-9e55-05d973de817e\" (UID: \"3f01830c-0127-4d73-9e55-05d973de817e\") " Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.076757 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f01830c-0127-4d73-9e55-05d973de817e-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f01830c-0127-4d73-9e55-05d973de817e" (UID: "3f01830c-0127-4d73-9e55-05d973de817e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.077235 4933 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f01830c-0127-4d73-9e55-05d973de817e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.095145 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f01830c-0127-4d73-9e55-05d973de817e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f01830c-0127-4d73-9e55-05d973de817e" (UID: "3f01830c-0127-4d73-9e55-05d973de817e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.106295 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_662ba342-72a0-430b-b46d-d6f0f0eafd2b/openstack-network-exporter/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.122188 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f01830c-0127-4d73-9e55-05d973de817e-kube-api-access-bgrs5" (OuterVolumeSpecName: "kube-api-access-bgrs5") pod "3f01830c-0127-4d73-9e55-05d973de817e" (UID: "3f01830c-0127-4d73-9e55-05d973de817e"). InnerVolumeSpecName "kube-api-access-bgrs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.179564 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgrs5\" (UniqueName: \"kubernetes.io/projected/3f01830c-0127-4d73-9e55-05d973de817e-kube-api-access-bgrs5\") on node \"crc\" DevicePath \"\"" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.179593 4933 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f01830c-0127-4d73-9e55-05d973de817e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.252923 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn"] Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.255311 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_20794a38-c3e1-4b9e-a8d6-6df6d2a6bd7d/nova-metadata-metadata/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.261629 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411580-mvjxn"] Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.290050 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_662ba342-72a0-430b-b46d-d6f0f0eafd2b/ovsdbserver-nb/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.486433 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5b04559-f2ad-49dc-a280-626d6de841de/openstack-network-exporter/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.492943 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" event={"ID":"3f01830c-0127-4d73-9e55-05d973de817e","Type":"ContainerDied","Data":"fed79d2c68cf63eb91b0d32448d198866bd73c2a391b7c890792fe727ea20d74"} Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.493267 4933 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fed79d2c68cf63eb91b0d32448d198866bd73c2a391b7c890792fe727ea20d74" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.493149 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411625-xmk57" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.530305 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5b04559-f2ad-49dc-a280-626d6de841de/ovsdbserver-sb/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.769885 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64bbf7fc9b-xn4tj_250ea801-8fe7-44d3-b5ec-78cd22a3fa39/placement-api/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.848594 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/init-config-reloader/0.log" Dec 02 17:45:04 crc kubenswrapper[4933]: I1202 17:45:04.875330 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64bbf7fc9b-xn4tj_250ea801-8fe7-44d3-b5ec-78cd22a3fa39/placement-log/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.049379 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/config-reloader/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.060306 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/prometheus/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.064568 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2258a4f-47f3-4a6d-8715-b71be971f023" path="/var/lib/kubelet/pods/d2258a4f-47f3-4a6d-8715-b71be971f023/volumes" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.136077 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/init-config-reloader/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.194377 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8621d49b-46de-42d1-a6aa-291b8cda18ca/thanos-sidecar/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.372756 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd27c6c8-91fa-4036-b50e-996c263b202c/setup-container/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.521659 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd27c6c8-91fa-4036-b50e-996c263b202c/setup-container/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.574760 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd27c6c8-91fa-4036-b50e-996c263b202c/rabbitmq/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.575371 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_528bcbae-e7ca-4ad1-8e7d-571da6cb972b/setup-container/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.791081 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_528bcbae-e7ca-4ad1-8e7d-571da6cb972b/setup-container/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.799518 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_528bcbae-e7ca-4ad1-8e7d-571da6cb972b/rabbitmq/0.log" Dec 02 17:45:05 crc kubenswrapper[4933]: I1202 17:45:05.946285 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-r5bzb_b2b6f713-7e70-4d1e-9ee4-5fa433a01ded/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.046034 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-l7rfq_1eeef066-30d5-47f9-90a0-2815244b7ebb/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.054335 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:45:06 crc kubenswrapper[4933]: E1202 17:45:06.054738 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.198694 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mznps_b712434b-6106-44ed-aa67-1328e50cdb2c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.352736 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fgvgb_0feefe80-99a0-4d78-9753-c823a10fc0f8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.452304 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4m7pz_8b53d0b1-3511-4e0c-9d87-b7ceab39da16/ssh-known-hosts-edpm-deployment/0.log" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.714068 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64f6f8f6c-x5wnf_97b9d44e-b2f2-4da0-82e0-28c657d8df41/proxy-server/0.log" Dec 02 17:45:06 crc kubenswrapper[4933]: I1202 17:45:06.907322 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64f6f8f6c-x5wnf_97b9d44e-b2f2-4da0-82e0-28c657d8df41/proxy-httpd/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.048602 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vwmtj_591c4cbc-2470-449b-9046-76b4c6543cb9/swift-ring-rebalance/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.182964 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-auditor/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.284326 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-reaper/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.352424 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-replicator/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.395915 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/account-server/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.410133 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-auditor/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.574833 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-server/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.581754 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-replicator/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.647477 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-auditor/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.664664 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/container-updater/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.809746 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-replicator/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.816656 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-expirer/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.860940 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-updater/0.log" Dec 02 17:45:07 crc kubenswrapper[4933]: I1202 17:45:07.921877 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/object-server/0.log" Dec 02 17:45:08 crc kubenswrapper[4933]: I1202 17:45:08.021439 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/rsync/0.log" Dec 02 17:45:08 crc kubenswrapper[4933]: I1202 17:45:08.037698 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cf325ab8-af91-4009-9e8b-a299db2234da/swift-recon-cron/0.log" Dec 02 17:45:08 crc kubenswrapper[4933]: I1202 17:45:08.238374 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zzqmp_efd2ea83-1192-411e-9c7f-bff8761883fb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:08 crc kubenswrapper[4933]: I1202 17:45:08.345445 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-hrqsx_06b52c33-11b2-4a83-a9b8-2de5845e6e89/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:08 crc kubenswrapper[4933]: I1202 17:45:08.546579 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_595befc5-8c2b-4b8e-8991-e12460cd0ef2/test-operator-logs-container/0.log" Dec 02 17:45:08 crc kubenswrapper[4933]: I1202 17:45:08.752795 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9htq4_7aa88472-106f-484d-ae92-94c064b2a908/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 17:45:09 crc kubenswrapper[4933]: I1202 17:45:09.425524 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bd0fb308-1858-4dea-bf49-38e577824bd0/tempest-tests-tempest-tests-runner/0.log" Dec 02 17:45:17 crc kubenswrapper[4933]: I1202 17:45:17.067752 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:45:17 crc kubenswrapper[4933]: E1202 17:45:17.071215 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:45:17 crc kubenswrapper[4933]: I1202 17:45:17.526243 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_32a2d025-2245-46a6-82d1-228e920490a3/memcached/0.log" Dec 02 17:45:28 crc kubenswrapper[4933]: I1202 17:45:28.054111 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:45:28 crc kubenswrapper[4933]: I1202 17:45:28.823841 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"78b5ba533150423d7bf35cf21bf475e020dd6bbdd76ca527e9cad7d7b2b04066"} Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.886045 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2n5jm"] Dec 02 17:45:35 crc kubenswrapper[4933]: E1202 17:45:35.887196 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f01830c-0127-4d73-9e55-05d973de817e" containerName="collect-profiles" Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.887211 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f01830c-0127-4d73-9e55-05d973de817e" containerName="collect-profiles" Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.887477 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f01830c-0127-4d73-9e55-05d973de817e" containerName="collect-profiles" Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.889328 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.922643 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2n5jm"] Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.964720 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-catalog-content\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.964785 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2czb\" (UniqueName: \"kubernetes.io/projected/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-kube-api-access-l2czb\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:35 crc kubenswrapper[4933]: I1202 17:45:35.964820 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-utilities\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.066947 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-catalog-content\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.067022 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2czb\" (UniqueName: \"kubernetes.io/projected/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-kube-api-access-l2czb\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.067086 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-utilities\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.068105 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-utilities\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.068212 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-catalog-content\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.086644 4933 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l2czb\" (UniqueName: \"kubernetes.io/projected/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-kube-api-access-l2czb\") pod \"certified-operators-2n5jm\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.228112 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.853101 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2n5jm"] Dec 02 17:45:36 crc kubenswrapper[4933]: W1202 17:45:36.864987 4933 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b63fd4_9d4f_4550_8123_7c89ed9ed1b8.slice/crio-81be20520cd45d69136690c8a3136d2efb7db9edd4efbf3acc6f57401e86fd86 WatchSource:0}: Error finding container 81be20520cd45d69136690c8a3136d2efb7db9edd4efbf3acc6f57401e86fd86: Status 404 returned error can't find the container with id 81be20520cd45d69136690c8a3136d2efb7db9edd4efbf3acc6f57401e86fd86 Dec 02 17:45:36 crc kubenswrapper[4933]: I1202 17:45:36.921159 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerStarted","Data":"81be20520cd45d69136690c8a3136d2efb7db9edd4efbf3acc6f57401e86fd86"} Dec 02 17:45:37 crc kubenswrapper[4933]: I1202 17:45:37.936161 4933 generic.go:334] "Generic (PLEG): container finished" podID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerID="530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f" exitCode=0 Dec 02 17:45:37 crc kubenswrapper[4933]: I1202 17:45:37.936588 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerDied","Data":"530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f"} Dec 02 17:45:38 crc kubenswrapper[4933]: I1202 17:45:38.948920 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerStarted","Data":"47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e"} Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.084284 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/util/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.320934 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/util/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.409808 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/pull/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.410067 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/pull/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.536535 4933 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/pull/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.647769 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/util/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.647952 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7c6rtb_2749120d-52e8-46d6-b419-b0c2375e0355/extract/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.809768 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ltqgf_d7ebb5f9-1140-47dc-a2e2-1952a295d218/kube-rbac-proxy/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.891690 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zhsjb_4a6c2fb7-89ad-402e-8e31-b56e50d1386c/kube-rbac-proxy/0.log" Dec 02 17:45:39 crc kubenswrapper[4933]: I1202 17:45:39.952969 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ltqgf_d7ebb5f9-1140-47dc-a2e2-1952a295d218/manager/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.286549 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zhsjb_4a6c2fb7-89ad-402e-8e31-b56e50d1386c/manager/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.334679 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-vgblp_623d7dfe-ddc2-4557-95a3-02b8fb56ee35/kube-rbac-proxy/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.347577 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-vgblp_623d7dfe-ddc2-4557-95a3-02b8fb56ee35/manager/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.519065 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-9kzqg_443e1418-ad3d-4f7a-b7b0-682beceb2977/kube-rbac-proxy/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.652011 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-9kzqg_443e1418-ad3d-4f7a-b7b0-682beceb2977/manager/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.759924 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rzj5x_fba1c43c-fdfb-4ea0-939b-e38adcc79720/kube-rbac-proxy/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.893488 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rzj5x_fba1c43c-fdfb-4ea0-939b-e38adcc79720/manager/0.log" Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.941933 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-t7rsk_e7437781-26c4-4a57-8afc-6bbc1ef7f7dd/kube-rbac-proxy/0.log" Dec 02 17:45:40 crc 
kubenswrapper[4933]: I1202 17:45:40.969843 4933 generic.go:334] "Generic (PLEG): container finished" podID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerID="47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e" exitCode=0 Dec 02 17:45:40 crc kubenswrapper[4933]: I1202 17:45:40.970272 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerDied","Data":"47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e"} Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.065853 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-t7rsk_e7437781-26c4-4a57-8afc-6bbc1ef7f7dd/manager/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.165786 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ggssz_2e0aae7d-292b-4117-8b13-6021d1b5174a/kube-rbac-proxy/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.361382 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ggssz_2e0aae7d-292b-4117-8b13-6021d1b5174a/manager/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.569171 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ln9b5_e80caebf-2148-4666-98bb-963fce1bc84e/kube-rbac-proxy/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.652070 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ln9b5_e80caebf-2148-4666-98bb-963fce1bc84e/manager/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.885906 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mslvn_9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa/manager/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.926900 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-mslvn_9c02a5b3-2770-427a-a0a4-c1fac9bd0ffa/kube-rbac-proxy/0.log" Dec 02 17:45:41 crc kubenswrapper[4933]: I1202 17:45:41.989885 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerStarted","Data":"e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23"} Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.016756 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2n5jm" podStartSLOduration=3.56139282 podStartE2EDuration="7.0167373s" podCreationTimestamp="2025-12-02 17:45:35 +0000 UTC" firstStartedPulling="2025-12-02 17:45:37.938590976 +0000 UTC m=+6801.189817689" lastFinishedPulling="2025-12-02 17:45:41.393935476 +0000 UTC m=+6804.645162169" observedRunningTime="2025-12-02 17:45:42.006115209 +0000 UTC m=+6805.257341922" watchObservedRunningTime="2025-12-02 17:45:42.0167373 +0000 UTC m=+6805.267964003" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.147437 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-rmd89_a5d0bd48-d38b-44ae-b486-1ff751a0791a/kube-rbac-proxy/0.log" 
Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.167146 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-rmd89_a5d0bd48-d38b-44ae-b486-1ff751a0791a/manager/0.log" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.236698 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7zb88_22a42f16-b74f-4323-aee5-2d713c1232ea/kube-rbac-proxy/0.log" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.468453 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-n59qc_b8e9c61c-a72f-41e1-8f62-99d951ce4950/kube-rbac-proxy/0.log" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.481369 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7zb88_22a42f16-b74f-4323-aee5-2d713c1232ea/manager/0.log" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.586242 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-n59qc_b8e9c61c-a72f-41e1-8f62-99d951ce4950/manager/0.log" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.705265 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-87566_a35dd4ba-4d05-4af0-b0b2-2285e9e35889/kube-rbac-proxy/0.log" Dec 02 17:45:42 crc kubenswrapper[4933]: I1202 17:45:42.856830 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-87566_a35dd4ba-4d05-4af0-b0b2-2285e9e35889/manager/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.005375 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-t8bvj_7d831995-fd00-455a-822e-82eb0cca6a33/kube-rbac-proxy/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.067112 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-t8bvj_7d831995-fd00-455a-822e-82eb0cca6a33/manager/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.168855 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8_a9221a82-9f86-4d33-a63b-71bd4c532830/kube-rbac-proxy/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.273357 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd42vpt8_a9221a82-9f86-4d33-a63b-71bd4c532830/manager/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.596771 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69c497bc86-fnlmp_a85c4e8a-605a-46c0-9b73-a9fa99a314a1/operator/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.652813 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vv5bw_df3012be-3abc-4502-a517-2d685340047f/registry-server/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.785864 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mgdvd_9e637867-b8e7-48b3-987d-53172ec80734/kube-rbac-proxy/0.log" Dec 02 17:45:43 crc kubenswrapper[4933]: I1202 17:45:43.931941 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mgdvd_9e637867-b8e7-48b3-987d-53172ec80734/manager/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.022100 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-dbnk8_e19f7d2e-55da-4ba5-9a68-0d49c06eecf3/kube-rbac-proxy/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.046476 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-dbnk8_e19f7d2e-55da-4ba5-9a68-0d49c06eecf3/manager/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.193877 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-f8sqn_fc13fe0e-6c9a-4d9a-bf78-ec06c2962f67/operator/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.269611 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qgclr_d5d121c4-7a04-478d-b210-36b258949699/kube-rbac-proxy/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.484339 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qgclr_d5d121c4-7a04-478d-b210-36b258949699/manager/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.587151 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65697495f7-vqpwl_94050b59-4392-4a9d-9ce8-b5e2c61e0d46/kube-rbac-proxy/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.713708 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-596767c485-f2zz4_a23e8ea6-05d4-49b6-a661-30b1236cb653/manager/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.732641 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-w2ddp_7f82c652-2e90-4fd6-bd23-381c2f529a27/kube-rbac-proxy/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.866227 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-w2ddp_7f82c652-2e90-4fd6-bd23-381c2f529a27/manager/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.888049 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65697495f7-vqpwl_94050b59-4392-4a9d-9ce8-b5e2c61e0d46/manager/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.949605 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-c2pbl_7ab353e5-677f-499c-8909-47813767a26c/kube-rbac-proxy/0.log" Dec 02 17:45:44 crc kubenswrapper[4933]: I1202 17:45:44.950734 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-c2pbl_7ab353e5-677f-499c-8909-47813767a26c/manager/0.log" Dec 02 17:45:46 crc kubenswrapper[4933]: I1202 17:45:46.229101 4933 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:46 crc kubenswrapper[4933]: I1202 17:45:46.229678 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:47 crc kubenswrapper[4933]: I1202 17:45:47.296152 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2n5jm" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="registry-server" probeResult="failure" output=< Dec 02 17:45:47 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s Dec 02 17:45:47 crc kubenswrapper[4933]: > Dec 02 17:45:49 crc kubenswrapper[4933]: I1202 17:45:49.589185 4933 scope.go:117] "RemoveContainer" containerID="20da9cd0b4fb979e5d2d7914bb0487b52773ee1f4ce1ca057fbd710421ad4760" Dec 02 17:45:56 crc kubenswrapper[4933]: I1202 17:45:56.288133 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:56 crc kubenswrapper[4933]: I1202 17:45:56.375359 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:56 crc kubenswrapper[4933]: I1202 17:45:56.527587 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2n5jm"] Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.193171 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2n5jm" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="registry-server" containerID="cri-o://e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23" gracePeriod=2 Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.859154 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.967038 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2czb\" (UniqueName: \"kubernetes.io/projected/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-kube-api-access-l2czb\") pod \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.967530 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-catalog-content\") pod \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.967575 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-utilities\") pod \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\" (UID: \"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8\") " Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.969973 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-utilities" (OuterVolumeSpecName: "utilities") pod "19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" (UID: "19b63fd4-9d4f-4550-8123-7c89ed9ed1b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:45:58 crc kubenswrapper[4933]: I1202 17:45:58.974729 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-kube-api-access-l2czb" (OuterVolumeSpecName: "kube-api-access-l2czb") pod "19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" (UID: "19b63fd4-9d4f-4550-8123-7c89ed9ed1b8"). InnerVolumeSpecName "kube-api-access-l2czb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.023479 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" (UID: "19b63fd4-9d4f-4550-8123-7c89ed9ed1b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.071232 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2czb\" (UniqueName: \"kubernetes.io/projected/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-kube-api-access-l2czb\") on node \"crc\" DevicePath \"\"" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.071565 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.071584 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.198630 4933 generic.go:334] "Generic (PLEG): container finished" podID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerID="e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23" exitCode=0 Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.198713 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2n5jm" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.199832 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerDied","Data":"e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23"} Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.199906 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n5jm" event={"ID":"19b63fd4-9d4f-4550-8123-7c89ed9ed1b8","Type":"ContainerDied","Data":"81be20520cd45d69136690c8a3136d2efb7db9edd4efbf3acc6f57401e86fd86"} Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.199934 4933 scope.go:117] "RemoveContainer" containerID="e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.229964 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2n5jm"] Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.233476 4933 scope.go:117] "RemoveContainer" containerID="47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.237578 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2n5jm"] Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.254787 4933 scope.go:117] "RemoveContainer" containerID="530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.325213 4933 scope.go:117] "RemoveContainer" containerID="e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23" Dec 02 17:45:59 crc kubenswrapper[4933]: E1202 17:45:59.326194 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23\": container with ID starting with e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23 not found: ID does not exist" containerID="e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.326230 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23"} err="failed to get container status \"e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23\": rpc error: code = NotFound desc = could not find container \"e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23\": container with ID starting with e54533523f746e504940ee38b0199f1f21f5a68c88ba2d2196a04ef7e55d9e23 not found: ID does not exist" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.326253 4933 scope.go:117] "RemoveContainer" containerID="47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e" Dec 02 17:45:59 crc kubenswrapper[4933]: E1202 17:45:59.326630 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e\": container with ID starting with 47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e not found: ID does not exist" containerID="47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.326650 4933 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e"} err="failed to get container status \"47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e\": rpc error: code = NotFound desc = could not find container \"47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e\": container with ID starting with 47c8347ba6f86f5d8056c3153dbc32a6bf0286da1e4df9875341c1faf5fbf86e not found: ID does not exist" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.326664 4933 scope.go:117] "RemoveContainer" containerID="530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f" Dec 02 17:45:59 crc kubenswrapper[4933]: E1202 17:45:59.327061 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f\": container with ID starting with 530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f not found: ID does not exist" containerID="530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f" Dec 02 17:45:59 crc kubenswrapper[4933]: I1202 17:45:59.327082 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f"} err="failed to get container status \"530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f\": rpc error: code = NotFound desc = could not find container \"530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f\": container with ID starting with 530be00af95c838bf06465c810458895161ce26435ae0237389568b024d68f9f not found: ID does not exist" Dec 02 17:46:01 crc kubenswrapper[4933]: I1202 17:46:01.070797 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" path="/var/lib/kubelet/pods/19b63fd4-9d4f-4550-8123-7c89ed9ed1b8/volumes" Dec 02 17:46:06 crc kubenswrapper[4933]: I1202 17:46:06.574733 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-cx7jx_061606b3-9a47-4cff-ad31-04e9a5a05528/control-plane-machine-set-operator/0.log" Dec 02 17:46:06 crc kubenswrapper[4933]: I1202 17:46:06.794109 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-brhm4_2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4/kube-rbac-proxy/0.log" Dec 02 17:46:06 crc kubenswrapper[4933]: I1202 17:46:06.813737 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-brhm4_2e0ed0e9-47b5-44d2-89c5-9674e4dc3ed4/machine-api-operator/0.log" Dec 02 17:46:21 crc kubenswrapper[4933]: I1202 17:46:21.563586 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-dhbnp_e81749dd-af4f-46b4-954a-4581cf720cd5/cert-manager-controller/0.log" Dec 02 17:46:21 crc kubenswrapper[4933]: I1202 17:46:21.731627 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q4f6z_b2d605a1-7e07-47c6-af5f-e2f2a41df05d/cert-manager-cainjector/0.log" Dec 02 17:46:21 crc kubenswrapper[4933]: I1202 17:46:21.791595 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lpsrb_41523154-4286-47e3-9d29-3adb94b50073/cert-manager-webhook/0.log" Dec 02 17:46:30 crc 
kubenswrapper[4933]: I1202 17:46:30.976251 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t2zlc"] Dec 02 17:46:30 crc kubenswrapper[4933]: E1202 17:46:30.977941 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="extract-content" Dec 02 17:46:30 crc kubenswrapper[4933]: I1202 17:46:30.977966 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="extract-content" Dec 02 17:46:30 crc kubenswrapper[4933]: E1202 17:46:30.977985 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="extract-utilities" Dec 02 17:46:30 crc kubenswrapper[4933]: I1202 17:46:30.977996 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="extract-utilities" Dec 02 17:46:30 crc kubenswrapper[4933]: E1202 17:46:30.978065 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="registry-server" Dec 02 17:46:30 crc kubenswrapper[4933]: I1202 17:46:30.978079 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="registry-server" Dec 02 17:46:30 crc kubenswrapper[4933]: I1202 17:46:30.978483 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b63fd4-9d4f-4550-8123-7c89ed9ed1b8" containerName="registry-server" Dec 02 17:46:30 crc kubenswrapper[4933]: I1202 17:46:30.981616 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:30 crc kubenswrapper[4933]: I1202 17:46:30.993112 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t2zlc"] Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.090991 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-catalog-content\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.091431 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9h2\" (UniqueName: \"kubernetes.io/projected/bce47a52-f2ec-49f5-9724-db529d2f38c7-kube-api-access-tv9h2\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.091745 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-utilities\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.195448 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-catalog-content\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " 
pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.195746 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9h2\" (UniqueName: \"kubernetes.io/projected/bce47a52-f2ec-49f5-9724-db529d2f38c7-kube-api-access-tv9h2\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.195912 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-utilities\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.200116 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-utilities\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.200906 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-catalog-content\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.281586 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9h2\" (UniqueName: \"kubernetes.io/projected/bce47a52-f2ec-49f5-9724-db529d2f38c7-kube-api-access-tv9h2\") pod \"community-operators-t2zlc\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.313103 4933 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:31 crc kubenswrapper[4933]: I1202 17:46:31.807789 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t2zlc"] Dec 02 17:46:32 crc kubenswrapper[4933]: I1202 17:46:32.568211 4933 generic.go:334] "Generic (PLEG): container finished" podID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerID="89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5" exitCode=0 Dec 02 17:46:32 crc kubenswrapper[4933]: I1202 17:46:32.568392 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerDied","Data":"89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5"} Dec 02 17:46:32 crc kubenswrapper[4933]: I1202 17:46:32.568446 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerStarted","Data":"688a8731dd6858b9ac47600a0efb93c551e864bf43ac3b2c567779d4b839248a"} Dec 02 17:46:33 crc kubenswrapper[4933]: I1202 17:46:33.585156 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerStarted","Data":"c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551"} Dec 02 17:46:34 crc kubenswrapper[4933]: I1202 17:46:34.608425 4933 generic.go:334] "Generic (PLEG): container finished" podID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerID="c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551" exitCode=0 Dec 02 17:46:34 crc kubenswrapper[4933]: I1202 17:46:34.608558 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerDied","Data":"c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551"} Dec 02 17:46:35 crc kubenswrapper[4933]: I1202 17:46:35.647855 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerStarted","Data":"7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4"} Dec 02 17:46:36 crc kubenswrapper[4933]: I1202 17:46:36.736126 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-gl4gm_a4ec34d6-4de1-4ecb-a60a-46d969b32ff4/nmstate-console-plugin/0.log" Dec 02 17:46:36 crc kubenswrapper[4933]: I1202 17:46:36.950531 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zjk85_6dec4a20-a5e7-4e76-80d5-8dfbb82021b3/nmstate-handler/0.log" Dec 02 17:46:37 crc kubenswrapper[4933]: I1202 17:46:37.067867 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qr4ph_09f6889b-f9be-48f6-a955-c19824c76cdd/kube-rbac-proxy/0.log" Dec 02 17:46:37 crc kubenswrapper[4933]: I1202 17:46:37.179108 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qr4ph_09f6889b-f9be-48f6-a955-c19824c76cdd/nmstate-metrics/0.log" Dec 02 17:46:37 crc kubenswrapper[4933]: I1202 17:46:37.267777 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-mqmjq_37530e12-c39d-43b8-887f-64e9584c1d48/nmstate-operator/0.log" Dec 02 17:46:37 crc kubenswrapper[4933]: I1202 17:46:37.366546 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-85srr_f007b4c7-10dc-452b-b11c-1506dc25da9a/nmstate-webhook/0.log" Dec 02 17:46:41 crc kubenswrapper[4933]: I1202 17:46:41.313688 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:41 crc kubenswrapper[4933]: I1202 17:46:41.314175 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:41 crc kubenswrapper[4933]: I1202 17:46:41.391014 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:41 crc kubenswrapper[4933]: I1202 17:46:41.420059 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t2zlc" podStartSLOduration=8.940247002 podStartE2EDuration="11.420040205s" podCreationTimestamp="2025-12-02 17:46:30 +0000 UTC" firstStartedPulling="2025-12-02 17:46:32.571945576 +0000 UTC m=+6855.823172279" lastFinishedPulling="2025-12-02 17:46:35.051738779 +0000 UTC m=+6858.302965482" observedRunningTime="2025-12-02 17:46:35.672929811 +0000 UTC m=+6858.924156514" watchObservedRunningTime="2025-12-02 17:46:41.420040205 +0000 UTC m=+6864.671266918" Dec 02 17:46:41 crc kubenswrapper[4933]: I1202 17:46:41.771589 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:41 crc kubenswrapper[4933]: I1202 17:46:41.842171 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t2zlc"] Dec 02 17:46:43 crc kubenswrapper[4933]: I1202 17:46:43.745973 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t2zlc" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="registry-server" containerID="cri-o://7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4" gracePeriod=2 Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.442149 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.544421 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-catalog-content\") pod \"bce47a52-f2ec-49f5-9724-db529d2f38c7\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.544479 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv9h2\" (UniqueName: \"kubernetes.io/projected/bce47a52-f2ec-49f5-9724-db529d2f38c7-kube-api-access-tv9h2\") pod \"bce47a52-f2ec-49f5-9724-db529d2f38c7\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.544676 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-utilities\") pod \"bce47a52-f2ec-49f5-9724-db529d2f38c7\" (UID: \"bce47a52-f2ec-49f5-9724-db529d2f38c7\") " Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.545492 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-utilities" (OuterVolumeSpecName: "utilities") pod "bce47a52-f2ec-49f5-9724-db529d2f38c7" (UID: "bce47a52-f2ec-49f5-9724-db529d2f38c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.554220 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce47a52-f2ec-49f5-9724-db529d2f38c7-kube-api-access-tv9h2" (OuterVolumeSpecName: "kube-api-access-tv9h2") pod "bce47a52-f2ec-49f5-9724-db529d2f38c7" (UID: "bce47a52-f2ec-49f5-9724-db529d2f38c7"). InnerVolumeSpecName "kube-api-access-tv9h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.590803 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bce47a52-f2ec-49f5-9724-db529d2f38c7" (UID: "bce47a52-f2ec-49f5-9724-db529d2f38c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.647440 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.647715 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv9h2\" (UniqueName: \"kubernetes.io/projected/bce47a52-f2ec-49f5-9724-db529d2f38c7-kube-api-access-tv9h2\") on node \"crc\" DevicePath \"\"" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.647811 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bce47a52-f2ec-49f5-9724-db529d2f38c7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.762306 4933 generic.go:334] "Generic (PLEG): container finished" podID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerID="7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4" exitCode=0 Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.762490 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerDied","Data":"7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4"} Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.763558 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2zlc" event={"ID":"bce47a52-f2ec-49f5-9724-db529d2f38c7","Type":"ContainerDied","Data":"688a8731dd6858b9ac47600a0efb93c551e864bf43ac3b2c567779d4b839248a"} Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.762586 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t2zlc" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.763689 4933 scope.go:117] "RemoveContainer" containerID="7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.811117 4933 scope.go:117] "RemoveContainer" containerID="c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.813157 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t2zlc"] Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.823625 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t2zlc"] Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.841285 4933 scope.go:117] "RemoveContainer" containerID="89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.908093 4933 scope.go:117] "RemoveContainer" containerID="7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4" Dec 02 17:46:44 crc kubenswrapper[4933]: E1202 17:46:44.908672 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4\": container with ID starting with 7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4 not found: ID does not exist" containerID="7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.908726 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4"} err="failed to get container status \"7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4\": rpc error: code = NotFound desc = could not find container \"7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4\": container with ID starting with 7821103f98eab10b1d565db2b97abc27251aace66544d611493551548b44aec4 not found: ID does not exist" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.908757 4933 scope.go:117] "RemoveContainer" containerID="c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551" Dec 02 17:46:44 crc kubenswrapper[4933]: E1202 17:46:44.909134 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551\": container with ID starting with c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551 not found: ID does not exist" containerID="c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.909169 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551"} err="failed to get container status \"c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551\": rpc error: code = NotFound desc = could not find container \"c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551\": container with ID starting with c1c420fd1e59256d2e259342d0fb05b77a6f553b7fd85b4f2e200253fcd5a551 not found: ID does not exist" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.909191 4933 scope.go:117] "RemoveContainer" 
containerID="89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5" Dec 02 17:46:44 crc kubenswrapper[4933]: E1202 17:46:44.909459 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5\": container with ID starting with 89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5 not found: ID does not exist" containerID="89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5" Dec 02 17:46:44 crc kubenswrapper[4933]: I1202 17:46:44.909486 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5"} err="failed to get container status \"89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5\": rpc error: code = NotFound desc = could not find container \"89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5\": container with ID starting with 89e0adec39ea2384d533e65c06fbc6efeff55aa6968ac9e9004ae50705b306d5 not found: ID does not exist" Dec 02 17:46:45 crc kubenswrapper[4933]: I1202 17:46:45.063848 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" path="/var/lib/kubelet/pods/bce47a52-f2ec-49f5-9724-db529d2f38c7/volumes" Dec 02 17:46:52 crc kubenswrapper[4933]: I1202 17:46:52.742901 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/manager/0.log" Dec 02 17:46:52 crc kubenswrapper[4933]: I1202 17:46:52.747967 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/kube-rbac-proxy/0.log" Dec 02 17:47:08 crc kubenswrapper[4933]: I1202 17:47:08.933586 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-sg6kn_5390a7b1-0b5d-485c-8fc3-44fca44d4286/cluster-logging-operator/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.120628 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-9hj2j_1a5bbd19-7aab-46a5-b5c6-9f3861d9dc8d/collector/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.160861 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_39e91993-69e7-44dc-bb73-5372571ce233/loki-compactor/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.278004 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-bps9l_92d8d29e-d630-4e5b-9697-5f388253535a/loki-distributor/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.323647 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-28287_349b144f-569c-411e-9651-db4659c64925/gateway/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.360767 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-28287_349b144f-569c-411e-9651-db4659c64925/opa/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.464050 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-wj2pd_c016a791-ff2e-4466-95e5-80333c909784/gateway/0.log" Dec 
02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.496468 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fdb4c644d-wj2pd_c016a791-ff2e-4466-95e5-80333c909784/opa/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.625797 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_50300b3a-1c70-41d4-a37c-5b121b3c0efb/loki-index-gateway/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.727169 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_cb0e0e26-ef5a-48bd-aa32-af5919aaf24a/loki-ingester/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.828049 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-d4wmg_b969ec3f-f1c9-4752-b473-1a14450526c1/loki-querier/0.log" Dec 02 17:47:09 crc kubenswrapper[4933]: I1202 17:47:09.921537 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-drtkk_4020f625-85e6-4986-84aa-1f73717025cb/loki-query-frontend/0.log" Dec 02 17:47:25 crc kubenswrapper[4933]: I1202 17:47:25.707952 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dkmmd_b18bcd0e-ac22-4425-940c-29311785588f/kube-rbac-proxy/0.log" Dec 02 17:47:25 crc kubenswrapper[4933]: I1202 17:47:25.758351 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dkmmd_b18bcd0e-ac22-4425-940c-29311785588f/controller/0.log" Dec 02 17:47:25 crc kubenswrapper[4933]: I1202 17:47:25.872451 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.064387 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.090351 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.093971 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.104198 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.304194 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.320976 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.338990 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.368850 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 
17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.502565 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-frr-files/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.528294 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-metrics/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.545121 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/cp-reloader/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.602040 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/controller/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.712700 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/frr-metrics/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.746847 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/kube-rbac-proxy/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.851578 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/kube-rbac-proxy-frr/0.log" Dec 02 17:47:26 crc kubenswrapper[4933]: I1202 17:47:26.939014 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/reloader/0.log" Dec 02 17:47:27 crc kubenswrapper[4933]: I1202 17:47:27.168338 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-r4spw_b5d26e47-d7ac-4bda-8b0e-559a3ee18eb6/frr-k8s-webhook-server/0.log" Dec 02 17:47:27 crc kubenswrapper[4933]: I1202 17:47:27.261445 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-777b6c8cc-hkdpl_a987e1fe-f524-4d29-98c4-cc0238d7b79f/manager/0.log" Dec 02 17:47:27 crc kubenswrapper[4933]: I1202 17:47:27.426301 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-ff64ff47b-dqgr2_3493ec7a-7065-4895-944d-2aefb1419718/webhook-server/0.log" Dec 02 17:47:27 crc kubenswrapper[4933]: I1202 17:47:27.665446 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7428k_cf92f734-644b-4a6c-b716-bf6269ae99b5/kube-rbac-proxy/0.log" Dec 02 17:47:28 crc kubenswrapper[4933]: I1202 17:47:28.196662 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7428k_cf92f734-644b-4a6c-b716-bf6269ae99b5/speaker/0.log" Dec 02 17:47:28 crc kubenswrapper[4933]: I1202 17:47:28.534529 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2zdck_3cc86070-243d-4886-9746-9fcab519fb50/frr/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.021595 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/util/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.172202 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/util/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.198422 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/pull/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.233618 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/pull/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.381606 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/util/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.401815 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/pull/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.441333 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8g5ctb_e3bd3f79-2e8f-42fc-b9c1-f7dc2ab455d7/extract/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.569670 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/util/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.701427 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/util/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.737766 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/pull/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.751348 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/pull/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.940796 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/util/0.log" Dec 02 17:47:43 crc kubenswrapper[4933]: I1202 17:47:43.984911 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/pull/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.022007 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7m5qk_9bc38c6b-b881-4fec-9d9f-d7bee85e17eb/extract/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.143657 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/util/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.279010 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/pull/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.291431 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/util/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.320394 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/pull/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.435738 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/util/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.477944 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/extract/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.484924 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ql9cd_90462580-2133-44d1-b7bc-c838b66ef30b/pull/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.613357 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/util/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.790348 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/pull/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.812586 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/util/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.816923 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/pull/0.log" Dec 02 17:47:44 crc kubenswrapper[4933]: I1202 17:47:44.986841 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/pull/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.007668 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/extract/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.048995 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fm792l_bf9b2c95-38b7-4a61-89ae-d40735ddccaa/util/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.157926 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/util/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.377068 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/util/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.417291 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/pull/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.423734 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/pull/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.577594 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/pull/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.752280 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/util/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.763590 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837tfk6_de440c6d-8572-4245-a443-c7ac5ebe5a69/extract/0.log" Dec 02 17:47:45 crc kubenswrapper[4933]: I1202 17:47:45.946576 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-utilities/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.092354 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-content/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.099697 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-content/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.104049 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-utilities/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.338221 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-utilities/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.347782 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/extract-content/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.609558 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-utilities/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.836287 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-content/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.836804 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-utilities/0.log" Dec 02 17:47:46 crc kubenswrapper[4933]: I1202 17:47:46.855209 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-content/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.033370 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-utilities/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.067249 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/extract-content/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.169465 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.169510 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.259402 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/2.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.269163 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k9r4r_e66c3d98-54cc-4d3f-a1b4-3eee8ed9ab51/marketplace-operator/3.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.415266 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hv4zj_fa22a7e7-ae56-415c-ba73-37c19aa18fcb/registry-server/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.441980 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-utilities/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.680446 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-content/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.688071 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-utilities/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 
17:47:47.718605 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-content/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.834033 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h6v9b_dfb3af35-57b8-4936-9baf-7cc31fe9682b/registry-server/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.870611 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-content/0.log" Dec 02 17:47:47 crc kubenswrapper[4933]: I1202 17:47:47.898686 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/extract-utilities/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.051044 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-utilities/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.146034 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6cs_797f2838-8711-4f72-af0c-2fe515a73e03/registry-server/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.277195 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-utilities/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.283483 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-content/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.288799 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-content/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.416213 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-content/0.log" Dec 02 17:47:48 crc kubenswrapper[4933]: I1202 17:47:48.437341 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/extract-utilities/0.log" Dec 02 17:47:49 crc kubenswrapper[4933]: I1202 17:47:49.209694 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jpwwl_cb82c223-384a-463c-9deb-8cfe4a50ffd7/registry-server/0.log" Dec 02 17:48:02 crc kubenswrapper[4933]: I1202 17:48:02.208402 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-pcq8l_9d0e9982-e6e0-43d3-8e6d-8835a52fe9d8/prometheus-operator/0.log" Dec 02 17:48:02 crc kubenswrapper[4933]: I1202 17:48:02.381961 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-cd6cfcdf6-wnjsz_15f56600-a066-40fd-8433-d0552173dc57/prometheus-operator-admission-webhook/0.log" Dec 02 17:48:02 crc kubenswrapper[4933]: I1202 17:48:02.383155 4933 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-cd6cfcdf6-cwxhj_a8b5527a-5f6c-461a-8397-c911f538eb3a/prometheus-operator-admission-webhook/0.log" Dec 02 17:48:02 crc kubenswrapper[4933]: I1202 17:48:02.579372 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-dpvq4_9f22c4f9-d4f0-4ff8-9322-f03662c116a8/operator/0.log" Dec 02 17:48:02 crc kubenswrapper[4933]: I1202 17:48:02.605271 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-zdcrf_d691d907-a29d-40ad-ad96-009e8a7d56e8/observability-ui-dashboards/0.log" Dec 02 17:48:02 crc kubenswrapper[4933]: I1202 17:48:02.739990 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-s9dt9_1ec8da33-52a7-4abb-a205-7c14a8186f5e/perses-operator/0.log" Dec 02 17:48:17 crc kubenswrapper[4933]: I1202 17:48:17.564142 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:48:17 crc kubenswrapper[4933]: I1202 17:48:17.564919 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:48:17 crc kubenswrapper[4933]: I1202 17:48:17.591756 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/manager/0.log" Dec 02 17:48:17 crc kubenswrapper[4933]: I1202 17:48:17.716387 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dfc66d568-zbwvk_a390b3c7-fa88-43d0-81d1-ca767f1e9133/kube-rbac-proxy/0.log" Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.169901 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.170532 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.170597 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.171920 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78b5ba533150423d7bf35cf21bf475e020dd6bbdd76ca527e9cad7d7b2b04066"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.172026 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://78b5ba533150423d7bf35cf21bf475e020dd6bbdd76ca527e9cad7d7b2b04066" gracePeriod=600 Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.959100 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="78b5ba533150423d7bf35cf21bf475e020dd6bbdd76ca527e9cad7d7b2b04066" exitCode=0 Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.959230 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"78b5ba533150423d7bf35cf21bf475e020dd6bbdd76ca527e9cad7d7b2b04066"} Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.959814 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerStarted","Data":"7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b"} Dec 02 17:48:47 crc kubenswrapper[4933]: I1202 17:48:47.959898 4933 scope.go:117] "RemoveContainer" containerID="2ee3071a320876f6620f091b90e8722600baacb646bda6aad185c6e1a117bdd8" Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.115025 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cjskf"] Dec 02 17:49:34 crc kubenswrapper[4933]: E1202 17:49:34.116359 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="extract-utilities" Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.116373 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="extract-utilities" Dec 02 17:49:34 crc kubenswrapper[4933]: E1202 17:49:34.116381 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="registry-server" Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.116387 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="registry-server" Dec 02 17:49:34 crc kubenswrapper[4933]: E1202 17:49:34.116420 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="extract-content" Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.116436 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="extract-content" Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.116704 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce47a52-f2ec-49f5-9724-db529d2f38c7" containerName="registry-server" Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.118362 4933 util.go:30] "No sandbox for pod can be found. 
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.118362 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.128487 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjskf"]
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.260881 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-catalog-content\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.261142 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-utilities\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.261218 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jc86\" (UniqueName: \"kubernetes.io/projected/d1ce61bc-2be6-4b16-9b36-075024f6bd14-kube-api-access-2jc86\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.363059 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-utilities\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.363118 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jc86\" (UniqueName: \"kubernetes.io/projected/d1ce61bc-2be6-4b16-9b36-075024f6bd14-kube-api-access-2jc86\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.363299 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-catalog-content\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.363802 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-catalog-content\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.364048 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-utilities\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.393698 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jc86\" (UniqueName: \"kubernetes.io/projected/d1ce61bc-2be6-4b16-9b36-075024f6bd14-kube-api-access-2jc86\") pod \"redhat-operators-cjskf\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") " pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:34 crc kubenswrapper[4933]: I1202 17:49:34.456715 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:35 crc kubenswrapper[4933]: I1202 17:49:35.051647 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjskf"]
Dec 02 17:49:35 crc kubenswrapper[4933]: I1202 17:49:35.691195 4933 generic.go:334] "Generic (PLEG): container finished" podID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerID="88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd" exitCode=0
Dec 02 17:49:35 crc kubenswrapper[4933]: I1202 17:49:35.691317 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerDied","Data":"88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd"}
Dec 02 17:49:35 crc kubenswrapper[4933]: I1202 17:49:35.691485 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerStarted","Data":"5b33944369cb3e8ee45b4e316252ed22e26a29221aaa594fb2441b66ffab908b"}
Dec 02 17:49:35 crc kubenswrapper[4933]: I1202 17:49:35.693996 4933 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 17:49:36 crc kubenswrapper[4933]: I1202 17:49:36.718364 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerStarted","Data":"2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f"}
Dec 02 17:49:39 crc kubenswrapper[4933]: I1202 17:49:39.758804 4933 generic.go:334] "Generic (PLEG): container finished" podID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerID="2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f" exitCode=0
Dec 02 17:49:39 crc kubenswrapper[4933]: I1202 17:49:39.759743 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerDied","Data":"2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f"}
Dec 02 17:49:40 crc kubenswrapper[4933]: I1202 17:49:40.774618 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerStarted","Data":"a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9"}
Dec 02 17:49:40 crc kubenswrapper[4933]: I1202 17:49:40.809949 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cjskf" podStartSLOduration=2.365447455 podStartE2EDuration="6.809929746s" podCreationTimestamp="2025-12-02 17:49:34 +0000 UTC" firstStartedPulling="2025-12-02 17:49:35.693473809 +0000 UTC m=+7038.944700512" lastFinishedPulling="2025-12-02 17:49:40.1379561 +0000 UTC m=+7043.389182803" observedRunningTime="2025-12-02 17:49:40.801261455 +0000 UTC m=+7044.052488158" watchObservedRunningTime="2025-12-02 17:49:40.809929746 +0000 UTC m=+7044.061156479"
Dec 02 17:49:44 crc kubenswrapper[4933]: I1202 17:49:44.457170 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:44 crc kubenswrapper[4933]: I1202 17:49:44.459095 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:49:45 crc kubenswrapper[4933]: I1202 17:49:45.516392 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cjskf" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="registry-server" probeResult="failure" output=<
Dec 02 17:49:45 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s
Dec 02 17:49:45 crc kubenswrapper[4933]: >
Dec 02 17:49:55 crc kubenswrapper[4933]: I1202 17:49:55.540040 4933 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cjskf" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="registry-server" probeResult="failure" output=<
Dec 02 17:49:55 crc kubenswrapper[4933]: timeout: failed to connect service ":50051" within 1s
Dec 02 17:49:55 crc kubenswrapper[4933]: >
Dec 02 17:50:04 crc kubenswrapper[4933]: I1202 17:50:04.536953 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:50:04 crc kubenswrapper[4933]: I1202 17:50:04.614474 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:50:05 crc kubenswrapper[4933]: I1202 17:50:05.089050 4933 generic.go:334] "Generic (PLEG): container finished" podID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerID="3922682c0c171c839f0090e22a4d32e9b51e2254edbfb5872d993628fed90aea" exitCode=0
Dec 02 17:50:05 crc kubenswrapper[4933]: I1202 17:50:05.089931 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pg8p/must-gather-s92q2" event={"ID":"3e542deb-5ddf-4bfc-9eea-73e4d47aa429","Type":"ContainerDied","Data":"3922682c0c171c839f0090e22a4d32e9b51e2254edbfb5872d993628fed90aea"}
Dec 02 17:50:05 crc kubenswrapper[4933]: I1202 17:50:05.091139 4933 scope.go:117] "RemoveContainer" containerID="3922682c0c171c839f0090e22a4d32e9b51e2254edbfb5872d993628fed90aea"
Dec 02 17:50:05 crc kubenswrapper[4933]: I1202 17:50:05.481776 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjskf"]
Dec 02 17:50:05 crc kubenswrapper[4933]: I1202 17:50:05.703286 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7pg8p_must-gather-s92q2_3e542deb-5ddf-4bfc-9eea-73e4d47aa429/gather/0.log"
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.104178 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cjskf" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="registry-server" containerID="cri-o://a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9" gracePeriod=2
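
The startup-probe failures above (timeout connecting to ":50051" within 1s, at 17:49:45 and 17:49:55) followed by status="started" at 17:50:04 are the usual pattern for a marketplace catalog pod: the registry-server's gRPC endpoint on port 50051 only answers once the catalog has loaded. The probe output resembles a grpc_health_probe-style check; a rough Python stand-in under that assumption (it only tests TCP reachability within the same 1s budget, not the full gRPC health protocol):

    import socket

    def registry_ready(host: str = "127.0.0.1", port: int = 50051,
                       timeout: float = 1.0) -> bool:
        # Rough stand-in for the probe above: succeeds once the registry
        # port accepts a connection within the 1s budget the log shows.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False
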
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.672062 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.859784 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jc86\" (UniqueName: \"kubernetes.io/projected/d1ce61bc-2be6-4b16-9b36-075024f6bd14-kube-api-access-2jc86\") pod \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") "
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.860058 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-catalog-content\") pod \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") "
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.860481 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-utilities\") pod \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\" (UID: \"d1ce61bc-2be6-4b16-9b36-075024f6bd14\") "
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.861459 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-utilities" (OuterVolumeSpecName: "utilities") pod "d1ce61bc-2be6-4b16-9b36-075024f6bd14" (UID: "d1ce61bc-2be6-4b16-9b36-075024f6bd14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.862095 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.882216 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ce61bc-2be6-4b16-9b36-075024f6bd14-kube-api-access-2jc86" (OuterVolumeSpecName: "kube-api-access-2jc86") pod "d1ce61bc-2be6-4b16-9b36-075024f6bd14" (UID: "d1ce61bc-2be6-4b16-9b36-075024f6bd14"). InnerVolumeSpecName "kube-api-access-2jc86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.966272 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jc86\" (UniqueName: \"kubernetes.io/projected/d1ce61bc-2be6-4b16-9b36-075024f6bd14-kube-api-access-2jc86\") on node \"crc\" DevicePath \"\""
Dec 02 17:50:06 crc kubenswrapper[4933]: I1202 17:50:06.988032 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1ce61bc-2be6-4b16-9b36-075024f6bd14" (UID: "d1ce61bc-2be6-4b16-9b36-075024f6bd14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.069428 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1ce61bc-2be6-4b16-9b36-075024f6bd14-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.123133 4933 generic.go:334] "Generic (PLEG): container finished" podID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerID="a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9" exitCode=0
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.123176 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerDied","Data":"a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9"}
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.123205 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjskf" event={"ID":"d1ce61bc-2be6-4b16-9b36-075024f6bd14","Type":"ContainerDied","Data":"5b33944369cb3e8ee45b4e316252ed22e26a29221aaa594fb2441b66ffab908b"}
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.123227 4933 scope.go:117] "RemoveContainer" containerID="a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.123573 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjskf"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.158038 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjskf"]
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.167752 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cjskf"]
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.175255 4933 scope.go:117] "RemoveContainer" containerID="2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.202866 4933 scope.go:117] "RemoveContainer" containerID="88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.271706 4933 scope.go:117] "RemoveContainer" containerID="a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9"
Dec 02 17:50:07 crc kubenswrapper[4933]: E1202 17:50:07.274765 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9\": container with ID starting with a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9 not found: ID does not exist" containerID="a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.274837 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9"} err="failed to get container status \"a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9\": rpc error: code = NotFound desc = could not find container \"a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9\": container with ID starting with a83ee7c55a2efd799289c1cefd89cbf18c20d3e28a4f54946158d8f1662001f9 not found: ID does not exist"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.274871 4933 scope.go:117] "RemoveContainer" containerID="2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f"
Dec 02 17:50:07 crc kubenswrapper[4933]: E1202 17:50:07.275450 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f\": container with ID starting with 2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f not found: ID does not exist" containerID="2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.275498 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f"} err="failed to get container status \"2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f\": rpc error: code = NotFound desc = could not find container \"2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f\": container with ID starting with 2f3dca41e6f6da9b13e1871e186df93d3cb84275961bf95309d990eb37cbdd9f not found: ID does not exist"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.275530 4933 scope.go:117] "RemoveContainer" containerID="88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd"
Dec 02 17:50:07 crc kubenswrapper[4933]: E1202 17:50:07.275858 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd\": container with ID starting with 88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd not found: ID does not exist" containerID="88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd"
Dec 02 17:50:07 crc kubenswrapper[4933]: I1202 17:50:07.275888 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd"} err="failed to get container status \"88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd\": rpc error: code = NotFound desc = could not find container \"88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd\": container with ID starting with 88e4c9f3697c052066db428baf1a7aa4e19975ba13bf4423b484ff5d11d871cd not found: ID does not exist"
Dec 02 17:50:09 crc kubenswrapper[4933]: I1202 17:50:09.070943 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" path="/var/lib/kubelet/pods/d1ce61bc-2be6-4b16-9b36-075024f6bd14/volumes"
Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.021352 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pg8p/must-gather-s92q2"]
Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.022469 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7pg8p/must-gather-s92q2" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="copy" containerID="cri-o://2b76cd34a84fae6817ecf59866964e90adff390d922825b0fc42e143f2d9c160" gracePeriod=2
Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.048490 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pg8p/must-gather-s92q2"]
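
The NotFound errors above are a benign race during pod deletion: by the time the kubelet asks the runtime to delete the catalog pod's containers, CRI-O has already pruned them along with the sandbox, so "container not found" effectively means "already removed". Deletion therefore has to be idempotent; a minimal sketch of that pattern (the runtime client and its exception type are hypothetical, standing in for the gRPC NotFound status seen here):

    class NotFoundError(Exception):
        """Stands in for the gRPC NotFound status in the log above."""

    def remove_container(runtime, container_id: str) -> None:
        # Deletion must be idempotent: if the runtime already pruned the
        # container (as CRI-O did once the pod sandbox went away),
        # NotFound counts as success, not failure.
        try:
            runtime.remove(container_id)   # 'runtime' is a hypothetical CRI client
        except NotFoundError:
            pass  # already gone; nothing left to clean up
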
path="/var/log/pods/openshift-must-gather-7pg8p_must-gather-s92q2_3e542deb-5ddf-4bfc-9eea-73e4d47aa429/copy/0.log" Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.318307 4933 generic.go:334] "Generic (PLEG): container finished" podID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerID="2b76cd34a84fae6817ecf59866964e90adff390d922825b0fc42e143f2d9c160" exitCode=143 Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.726597 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7pg8p_must-gather-s92q2_3e542deb-5ddf-4bfc-9eea-73e4d47aa429/copy/0.log" Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.727676 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.834299 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2828\" (UniqueName: \"kubernetes.io/projected/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-kube-api-access-x2828\") pod \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.834703 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-must-gather-output\") pod \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\" (UID: \"3e542deb-5ddf-4bfc-9eea-73e4d47aa429\") " Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.843819 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-kube-api-access-x2828" (OuterVolumeSpecName: "kube-api-access-x2828") pod "3e542deb-5ddf-4bfc-9eea-73e4d47aa429" (UID: "3e542deb-5ddf-4bfc-9eea-73e4d47aa429"). InnerVolumeSpecName "kube-api-access-x2828". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 17:50:19 crc kubenswrapper[4933]: I1202 17:50:19.937096 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2828\" (UniqueName: \"kubernetes.io/projected/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-kube-api-access-x2828\") on node \"crc\" DevicePath \"\"" Dec 02 17:50:20 crc kubenswrapper[4933]: I1202 17:50:20.006439 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3e542deb-5ddf-4bfc-9eea-73e4d47aa429" (UID: "3e542deb-5ddf-4bfc-9eea-73e4d47aa429"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 17:50:20 crc kubenswrapper[4933]: I1202 17:50:20.039721 4933 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e542deb-5ddf-4bfc-9eea-73e4d47aa429-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 17:50:20 crc kubenswrapper[4933]: I1202 17:50:20.331274 4933 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7pg8p_must-gather-s92q2_3e542deb-5ddf-4bfc-9eea-73e4d47aa429/copy/0.log" Dec 02 17:50:20 crc kubenswrapper[4933]: I1202 17:50:20.333636 4933 scope.go:117] "RemoveContainer" containerID="2b76cd34a84fae6817ecf59866964e90adff390d922825b0fc42e143f2d9c160" Dec 02 17:50:20 crc kubenswrapper[4933]: I1202 17:50:20.333699 4933 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pg8p/must-gather-s92q2" Dec 02 17:50:20 crc kubenswrapper[4933]: I1202 17:50:20.364109 4933 scope.go:117] "RemoveContainer" containerID="3922682c0c171c839f0090e22a4d32e9b51e2254edbfb5872d993628fed90aea" Dec 02 17:50:21 crc kubenswrapper[4933]: I1202 17:50:21.069294 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" path="/var/lib/kubelet/pods/3e542deb-5ddf-4bfc-9eea-73e4d47aa429/volumes" Dec 02 17:50:47 crc kubenswrapper[4933]: I1202 17:50:47.170018 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:50:47 crc kubenswrapper[4933]: I1202 17:50:47.173008 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:50:49 crc kubenswrapper[4933]: I1202 17:50:49.809144 4933 scope.go:117] "RemoveContainer" containerID="16e9993ea12cd25d9b56386486263292fb512ae34f6764d23fff4aa65f52499a" Dec 02 17:51:17 crc kubenswrapper[4933]: I1202 17:51:17.170478 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:51:17 crc kubenswrapper[4933]: I1202 17:51:17.171541 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.170225 4933 patch_prober.go:28] interesting pod/machine-config-daemon-d2p6w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.171071 4933 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.171147 4933 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.172656 4933 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b"} pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" containerMessage="Container machine-config-daemon failed liveness probe, will 
be restarted" Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.172764 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerName="machine-config-daemon" containerID="cri-o://7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" gracePeriod=600 Dec 02 17:51:47 crc kubenswrapper[4933]: E1202 17:51:47.320345 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.482694 4933 generic.go:334] "Generic (PLEG): container finished" podID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" exitCode=0 Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.482743 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" event={"ID":"e6c1c5e6-50dd-428a-890c-2c3f0456f2fa","Type":"ContainerDied","Data":"7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b"} Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.482861 4933 scope.go:117] "RemoveContainer" containerID="78b5ba533150423d7bf35cf21bf475e020dd6bbdd76ca527e9cad7d7b2b04066" Dec 02 17:51:47 crc kubenswrapper[4933]: I1202 17:51:47.483724 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:51:47 crc kubenswrapper[4933]: E1202 17:51:47.484208 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:51:58 crc kubenswrapper[4933]: I1202 17:51:58.054471 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:51:58 crc kubenswrapper[4933]: E1202 17:51:58.055885 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:52:12 crc kubenswrapper[4933]: I1202 17:52:12.053630 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:52:12 crc kubenswrapper[4933]: E1202 17:52:12.054642 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:52:27 crc kubenswrapper[4933]: I1202 17:52:27.072474 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:52:27 crc kubenswrapper[4933]: E1202 17:52:27.073934 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:52:42 crc kubenswrapper[4933]: I1202 17:52:42.054512 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:52:42 crc kubenswrapper[4933]: E1202 17:52:42.055777 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:52:54 crc kubenswrapper[4933]: I1202 17:52:54.054477 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:52:54 crc kubenswrapper[4933]: E1202 17:52:54.055571 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:53:06 crc kubenswrapper[4933]: I1202 17:53:06.053746 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:53:06 crc kubenswrapper[4933]: E1202 17:53:06.054892 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:53:19 crc kubenswrapper[4933]: I1202 17:53:19.054956 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:53:19 crc kubenswrapper[4933]: E1202 17:53:19.056907 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" 
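
The repeated "back-off 5m0s" rejections above are the kubelet's crash-loop back-off at its cap: machine-config-daemon failed its liveness probe twice (17:48:47 and 17:51:47) and is now held in CrashLoopBackOff, with each periodic pod sync re-emitting the "RemoveContainer"/"Error syncing pod" pair while the back-off timer runs. The schedule itself is not printed in this log; the sketch below assumes the upstream kubelet defaults (initial 10s delay, doubling per restart, capped at 5m):

    def crashloop_delays(restarts: int, base: float = 10.0, cap: float = 300.0):
        # Kubelet-style exponential back-off: 10s, 20s, 40s, ... capped at 5m.
        delay = base
        for _ in range(restarts):
            yield delay
            delay = min(delay * 2, cap)

    print(list(crashloop_delays(7)))  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]
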
podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:53:33 crc kubenswrapper[4933]: I1202 17:53:33.056094 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:53:33 crc kubenswrapper[4933]: E1202 17:53:33.057140 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:53:47 crc kubenswrapper[4933]: I1202 17:53:47.068012 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:53:47 crc kubenswrapper[4933]: E1202 17:53:47.069640 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.771693 4933 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ngshh"] Dec 02 17:53:54 crc kubenswrapper[4933]: E1202 17:53:54.773694 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="copy" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.773762 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="copy" Dec 02 17:53:54 crc kubenswrapper[4933]: E1202 17:53:54.773845 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="registry-server" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.773901 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="registry-server" Dec 02 17:53:54 crc kubenswrapper[4933]: E1202 17:53:54.773958 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="extract-content" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.774011 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="extract-content" Dec 02 17:53:54 crc kubenswrapper[4933]: E1202 17:53:54.774070 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="gather" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.774120 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="gather" Dec 02 17:53:54 crc kubenswrapper[4933]: E1202 17:53:54.774188 4933 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="extract-utilities" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.774236 4933 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="extract-utilities" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 
17:53:54.774534 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="gather" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.774593 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ce61bc-2be6-4b16-9b36-075024f6bd14" containerName="registry-server" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.774692 4933 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e542deb-5ddf-4bfc-9eea-73e4d47aa429" containerName="copy" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.776783 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.794250 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngshh"] Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.856281 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-utilities\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.856377 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-catalog-content\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.856496 4933 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lz9\" (UniqueName: \"kubernetes.io/projected/bf51d19b-802a-463f-a833-50973dce7583-kube-api-access-k9lz9\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.960350 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lz9\" (UniqueName: \"kubernetes.io/projected/bf51d19b-802a-463f-a833-50973dce7583-kube-api-access-k9lz9\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.960595 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-utilities\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.960663 4933 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-catalog-content\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.961330 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-utilities\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.961465 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-catalog-content\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:54 crc kubenswrapper[4933]: I1202 17:53:54.981259 4933 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lz9\" (UniqueName: \"kubernetes.io/projected/bf51d19b-802a-463f-a833-50973dce7583-kube-api-access-k9lz9\") pod \"redhat-marketplace-ngshh\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") " pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:55 crc kubenswrapper[4933]: I1202 17:53:55.108154 4933 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngshh" Dec 02 17:53:55 crc kubenswrapper[4933]: I1202 17:53:55.682267 4933 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngshh"] Dec 02 17:53:56 crc kubenswrapper[4933]: I1202 17:53:56.510684 4933 generic.go:334] "Generic (PLEG): container finished" podID="bf51d19b-802a-463f-a833-50973dce7583" containerID="ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef" exitCode=0 Dec 02 17:53:56 crc kubenswrapper[4933]: I1202 17:53:56.511173 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerDied","Data":"ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef"} Dec 02 17:53:56 crc kubenswrapper[4933]: I1202 17:53:56.511250 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerStarted","Data":"b1dfada127819b9a8192561abb4eaf36d1b0939eb56191c35cfbafd9925f3d2d"} Dec 02 17:53:57 crc kubenswrapper[4933]: I1202 17:53:57.527337 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerStarted","Data":"655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554"} Dec 02 17:53:58 crc kubenswrapper[4933]: I1202 17:53:58.539642 4933 generic.go:334] "Generic (PLEG): container finished" podID="bf51d19b-802a-463f-a833-50973dce7583" containerID="655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554" exitCode=0 Dec 02 17:53:58 crc kubenswrapper[4933]: I1202 17:53:58.539739 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerDied","Data":"655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554"} Dec 02 17:53:59 crc kubenswrapper[4933]: I1202 17:53:59.554005 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerStarted","Data":"a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8"} Dec 02 17:53:59 crc kubenswrapper[4933]: I1202 17:53:59.576697 4933 
Dec 02 17:53:59 crc kubenswrapper[4933]: I1202 17:53:59.576697 4933 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ngshh" podStartSLOduration=3.084441512 podStartE2EDuration="5.576677108s" podCreationTimestamp="2025-12-02 17:53:54 +0000 UTC" firstStartedPulling="2025-12-02 17:53:56.514840905 +0000 UTC m=+7299.766067608" lastFinishedPulling="2025-12-02 17:53:59.007076501 +0000 UTC m=+7302.258303204" observedRunningTime="2025-12-02 17:53:59.57110108 +0000 UTC m=+7302.822327793" watchObservedRunningTime="2025-12-02 17:53:59.576677108 +0000 UTC m=+7302.827903811"
Dec 02 17:54:00 crc kubenswrapper[4933]: I1202 17:54:00.054657 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b"
Dec 02 17:54:00 crc kubenswrapper[4933]: E1202 17:54:00.055281 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"
Dec 02 17:54:05 crc kubenswrapper[4933]: I1202 17:54:05.108803 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ngshh"
Dec 02 17:54:05 crc kubenswrapper[4933]: I1202 17:54:05.110004 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ngshh"
Dec 02 17:54:05 crc kubenswrapper[4933]: I1202 17:54:05.207365 4933 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ngshh"
Dec 02 17:54:05 crc kubenswrapper[4933]: I1202 17:54:05.696407 4933 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ngshh"
Dec 02 17:54:05 crc kubenswrapper[4933]: I1202 17:54:05.766336 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngshh"]
Dec 02 17:54:07 crc kubenswrapper[4933]: I1202 17:54:07.644195 4933 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ngshh" podUID="bf51d19b-802a-463f-a833-50973dce7583" containerName="registry-server" containerID="cri-o://a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8" gracePeriod=2
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.222138 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngshh"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.358098 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-utilities\") pod \"bf51d19b-802a-463f-a833-50973dce7583\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") "
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.358321 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-catalog-content\") pod \"bf51d19b-802a-463f-a833-50973dce7583\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") "
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.358399 4933 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9lz9\" (UniqueName: \"kubernetes.io/projected/bf51d19b-802a-463f-a833-50973dce7583-kube-api-access-k9lz9\") pod \"bf51d19b-802a-463f-a833-50973dce7583\" (UID: \"bf51d19b-802a-463f-a833-50973dce7583\") "
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.361018 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-utilities" (OuterVolumeSpecName: "utilities") pod "bf51d19b-802a-463f-a833-50973dce7583" (UID: "bf51d19b-802a-463f-a833-50973dce7583"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.369125 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf51d19b-802a-463f-a833-50973dce7583-kube-api-access-k9lz9" (OuterVolumeSpecName: "kube-api-access-k9lz9") pod "bf51d19b-802a-463f-a833-50973dce7583" (UID: "bf51d19b-802a-463f-a833-50973dce7583"). InnerVolumeSpecName "kube-api-access-k9lz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.406070 4933 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf51d19b-802a-463f-a833-50973dce7583" (UID: "bf51d19b-802a-463f-a833-50973dce7583"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.462590 4933 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.462625 4933 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9lz9\" (UniqueName: \"kubernetes.io/projected/bf51d19b-802a-463f-a833-50973dce7583-kube-api-access-k9lz9\") on node \"crc\" DevicePath \"\""
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.462636 4933 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf51d19b-802a-463f-a833-50973dce7583-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.666271 4933 generic.go:334] "Generic (PLEG): container finished" podID="bf51d19b-802a-463f-a833-50973dce7583" containerID="a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8" exitCode=0
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.666560 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerDied","Data":"a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8"}
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.666807 4933 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngshh" event={"ID":"bf51d19b-802a-463f-a833-50973dce7583","Type":"ContainerDied","Data":"b1dfada127819b9a8192561abb4eaf36d1b0939eb56191c35cfbafd9925f3d2d"}
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.666918 4933 scope.go:117] "RemoveContainer" containerID="a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.667200 4933 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngshh"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.709915 4933 scope.go:117] "RemoveContainer" containerID="655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.731345 4933 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngshh"]
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.741612 4933 scope.go:117] "RemoveContainer" containerID="ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.750167 4933 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngshh"]
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.815476 4933 scope.go:117] "RemoveContainer" containerID="a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8"
Dec 02 17:54:08 crc kubenswrapper[4933]: E1202 17:54:08.816105 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8\": container with ID starting with a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8 not found: ID does not exist" containerID="a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.816182 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8"} err="failed to get container status \"a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8\": rpc error: code = NotFound desc = could not find container \"a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8\": container with ID starting with a71c5952d0eb3bd2e3ae2dc54f24b5192721c4817c5474373dea1bae16d14cc8 not found: ID does not exist"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.816223 4933 scope.go:117] "RemoveContainer" containerID="655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554"
Dec 02 17:54:08 crc kubenswrapper[4933]: E1202 17:54:08.816652 4933 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554\": container with ID starting with 655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554 not found: ID does not exist" containerID="655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.816702 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554"} err="failed to get container status \"655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554\": rpc error: code = NotFound desc = could not find container \"655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554\": container with ID starting with 655f7e96f6f159c34022ff33ab456380882216abd79f70ee317c2304c8b77554 not found: ID does not exist"
Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.816736 4933 scope.go:117] "RemoveContainer" containerID="ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef"
failed" err="rpc error: code = NotFound desc = could not find container \"ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef\": container with ID starting with ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef not found: ID does not exist" containerID="ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef" Dec 02 17:54:08 crc kubenswrapper[4933]: I1202 17:54:08.817488 4933 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef"} err="failed to get container status \"ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef\": rpc error: code = NotFound desc = could not find container \"ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef\": container with ID starting with ccfd743f16e1cb28be48b71053bc017865c6cb9d852d69ff4356453d26a8f1ef not found: ID does not exist" Dec 02 17:54:09 crc kubenswrapper[4933]: I1202 17:54:09.069847 4933 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf51d19b-802a-463f-a833-50973dce7583" path="/var/lib/kubelet/pods/bf51d19b-802a-463f-a833-50973dce7583/volumes" Dec 02 17:54:11 crc kubenswrapper[4933]: I1202 17:54:11.053977 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:54:11 crc kubenswrapper[4933]: E1202 17:54:11.054630 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:54:22 crc kubenswrapper[4933]: I1202 17:54:22.057796 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:54:22 crc kubenswrapper[4933]: E1202 17:54:22.059250 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:54:34 crc kubenswrapper[4933]: I1202 17:54:34.055573 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:54:34 crc kubenswrapper[4933]: E1202 17:54:34.056504 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:54:46 crc kubenswrapper[4933]: I1202 17:54:46.054631 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:54:46 crc kubenswrapper[4933]: E1202 17:54:46.055775 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:54:58 crc kubenswrapper[4933]: I1202 17:54:58.055030 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:54:58 crc kubenswrapper[4933]: E1202 17:54:58.056206 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:55:13 crc kubenswrapper[4933]: I1202 17:55:13.054474 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:55:13 crc kubenswrapper[4933]: E1202 17:55:13.055326 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa" Dec 02 17:55:27 crc kubenswrapper[4933]: I1202 17:55:27.066934 4933 scope.go:117] "RemoveContainer" containerID="7c40d90a41c7f2f8aed02bda995f0865e4a7a77a8372ef4cdfaefeaf4a34037b" Dec 02 17:55:27 crc kubenswrapper[4933]: E1202 17:55:27.067881 4933 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2p6w_openshift-machine-config-operator(e6c1c5e6-50dd-428a-890c-2c3f0456f2fa)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2p6w" podUID="e6c1c5e6-50dd-428a-890c-2c3f0456f2fa"